Show simple item record

dc.contributor.author: Bartlett, ME
dc.contributor.author: Edmunds, CER
dc.contributor.author: Belpaeme, T
dc.contributor.author: Thill, S
dc.contributor.author: Lemaignan, S
dc.date.accessioned: 2019-09-02T11:41:41Z
dc.date.available: 2019-09-02T11:41:41Z
dc.date.issued: 2019-06-26
dc.identifier.issn: 2296-9144
dc.identifier.other: ARTN 49
dc.identifier.uri: http://hdl.handle.net/10026.1/14853
dc.description.abstract:

In recent years, the field of Human-Robot Interaction (HRI) has seen an increasing demand for technologies that can recognize and adapt to human behaviors and internal states (e.g., emotions and intentions). Psychological research suggests that human movements are important for inferring internal states. There is, however, a need to better understand what kind of information can be extracted from movement data, particularly in unconstrained, natural interactions. The present study examines which internal states and social constructs humans identify from movement in naturalistic social interactions. Participants viewed either clips of the full scene or processed versions displaying only 2D positional data, then filled out questionnaires assessing their social perception of the viewed material. We analyzed whether the full scene clips were more informative than the 2D positional data clips. First, we calculated the inter-rater agreement between participants in both conditions. Then, we employed machine learning classifiers to predict the internal states of the individuals in the videos based on the ratings obtained. Although we found higher inter-rater agreement for full scenes than for positional data, agreement in the latter case was still above chance, demonstrating that the internal states and social constructs under study were identifiable in both conditions. A factor analysis of participants' responses showed that participants identified the constructs interaction imbalance, interaction valence, and engagement regardless of video condition. The machine learning classifiers achieved similar performance in both conditions, again supporting the idea that movement alone carries relevant information. Overall, our results suggest it is reasonable to expect a machine learning algorithm, and consequently a robot, to successfully decode and classify a range of internal states and social constructs using low-dimensional data (such as the movements and poses of observed individuals) as input.

dc.format.extent: 49-
dc.format.medium: Electronic-eCollection
dc.language: eng
dc.language.iso: en
dc.publisher: Frontiers Media
dc.relation.uri: https://freeplay-sandbox.github.io
dc.relation.uri: https://github.com/severin-lemaignan/pinsoro-kinematics-study/blob/master/analysis/analyses_notebook.ipynb
dc.relation.uri: https://github.com/severin-lemaignan/pinsoro-kinematics-study/
dc.subject: social psychology
dc.subject: human-robot interaction
dc.subject: machine learning
dc.subject: social interaction
dc.subject: recognition
dc.title: What Can You See? Identifying Cues on Internal States From the Movements of Natural Social Interactions
dc.type: journal-article
dc.type: Article
plymouth.author-url: https://www.ncbi.nlm.nih.gov/pubmed/33501065
plymouth.issue: JUN
plymouth.volume: 6
plymouth.publisher-url: http://dx.doi.org/10.3389/frobt.2019.00049
plymouth.publication-status: Published online
plymouth.journal: Frontiers in Robotics and AI
dc.identifier.doi: 10.3389/frobt.2019.00049
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Health
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA11 Computer Science and Informatics
plymouth.organisational-group: /Plymouth/Research Groups
plymouth.organisational-group: /Plymouth/Research Groups/Marine Institute
plymouth.organisational-group: /Plymouth/Users by role
dc.publisher.place: Switzerland
dcterms.dateAccepted: 2019-06-06
dc.rights.embargodate: 2019-12-20
dc.identifier.eissn: 2296-9144
dc.rights.embargoperiod: Not known
rioxxterms.version: Version of Record
rioxxterms.versionofrecord: 10.3389/frobt.2019.00049
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.licenseref.startdate: 2019-06-26
rioxxterms.type: Journal Article/Review
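
The abstract reports that machine learning classifiers trained on ratings of 2D positional data performed comparably to those trained on full-scene ratings. As a loose, hypothetical illustration of this general idea (not the authors' pipeline, which is available in the linked repository), the sketch below trains a classifier to predict a binary "engagement" label from two synthetic positional features; the feature names, label, and data are all invented for demonstration:

```python
# Minimal sketch (synthetic data, illustrative feature/label names):
# classifying a social construct from low-dimensional positional features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic per-clip summary features standing in for 2D positional data:
# mean inter-person distance and mean movement speed (assumed quantities).
n_clips = 200
inter_person_distance = rng.normal(1.0, 0.3, n_clips)
movement_speed = rng.normal(0.5, 0.2, n_clips)
X = np.column_stack([inter_person_distance, movement_speed])

# Synthetic "engagement" label loosely tied to the features, so the
# classifier has some signal to recover.
y = (movement_speed - 0.5 * inter_person_distance
     + rng.normal(0, 0.1, n_clips)) > 0

# Cross-validated accuracy of a standard off-the-shelf classifier.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In a real setting, the features would be derived from pose-tracking output (as in the PInSoRo dataset referenced above) and the labels from human annotators' ratings; the point is only that a low-dimensional movement representation can feed a conventional classifier.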




All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.