Embedded multimodal nonverbal and verbal interactions between a mobile toy robot and autistic children

Irini Giannopulu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

8 Citations (Scopus)

Abstract

We studied the multimodal nonverbal and verbal relationship between autistic children and a mobile toy robot during free, spontaneous game play. A range of cognitive nonverbal criteria, including eye contact, touch, manipulation, and posture, were analyzed, and the frequency of words and verbs was calculated. The embedded multimodal interactions between autistic children and a mobile toy robot suggest that this robot could be used as a neural orthosis to improve children's brain activity and encourage them to express themselves verbally.

Original language: English
Title of host publication: HRI 2013 - Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction
Place of Publication: United States
Publisher: Wiley-IEEE Press
Pages: 127-128
Number of pages: 2
ISBN (Print): 9781467330558
DOI: 10.1109/HRI.2013.6483534
Publication status: Published - 2013
Externally published: Yes
Event: 8th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2013 - Tokyo, Japan
Duration: 3 Mar 2013 - 6 Mar 2013

Conference

Conference: 8th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2013
Country: Japan
City: Tokyo
Period: 3/03/13 - 6/03/13

Cite this

Giannopulu, I. (2013). Embedded multimodal nonverbal and verbal interactions between a mobile toy robot and autistic children. In HRI 2013 - Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (pp. 127-128). [6483534] United States: Wiley-IEEE Press. https://doi.org/10.1109/HRI.2013.6483534