Towards Automated Human-Robot Mutual Gaze
Authors:
Frank Broz
Hatice Kose-Bagci
Chrystopher L. Nehaniv
Kerstin Dautenhahn
Keywords: mutual gaze; human-robot interaction; psychology; Markov model
Abstract:
The role of gaze in interaction has been an area of increasing interest in the field of human-robot interaction. Mutual gaze, the pattern of behavior that arises when humans look directly at each other's faces, sends important social cues that communicate attention and personality traits and help regulate conversational turn-taking. In preparation for learning a computational model of mutual gaze that can be used as a controller for a robot, data from human-human pairs in a conversational task were collected using a gaze-tracking system and a face-tracking algorithm. The overall amount of mutual gaze observed between pairs agreed with predictions from the psychology literature. However, the duration of mutual gaze was shorter than predicted, and the amount of direct eye contact detected was, surprisingly, almost nonexistent. The results presented show the potential of this automated method to capture detailed information about human gaze behavior, and future applications to interaction-based robot language learning are discussed. Analyzing human-human mutual gaze with automated tracking allows past results that relied on hand-coding to be tested further and extended, and can provide both a method of data collection and input for controlling interactive robots.
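The "Markov model" keyword suggests treating the dyad's joint gaze behavior as a stochastic process whose transition probabilities are estimated from the tracked human-human data and then replayed to drive a robot's gaze. The sketch below is illustrative only, not the authors' implementation: the state names, transition probabilities, and function names are all assumptions chosen to show the general shape of such a controller.

```python
# Illustrative sketch (not the paper's implementation): a discrete-time
# Markov chain over joint dyadic gaze states, of the kind that could be
# estimated from tracked gaze logs and used to schedule a robot's gaze.
# All states and probabilities here are made-up placeholders.

import random

# Joint gaze states for a dyad: who is currently looking at the partner's face.
STATES = ["neither", "robot_only", "human_only", "mutual"]

# Hypothetical row-stochastic transition matrix P[s][s'].
# In practice, each row would be learned by counting state transitions
# in the recorded human-human gaze data and normalizing the counts.
P = {
    "neither":    {"neither": 0.70, "robot_only": 0.15, "human_only": 0.13, "mutual": 0.02},
    "robot_only": {"neither": 0.20, "robot_only": 0.60, "human_only": 0.05, "mutual": 0.15},
    "human_only": {"neither": 0.20, "robot_only": 0.05, "human_only": 0.60, "mutual": 0.15},
    "mutual":     {"neither": 0.05, "robot_only": 0.15, "human_only": 0.15, "mutual": 0.65},
}

def step(state: str) -> str:
    """Sample the next joint gaze state given the current one."""
    nxt = list(P[state].keys())
    weights = list(P[state].values())
    return random.choices(nxt, weights=weights, k=1)[0]

def robot_should_look(state: str) -> bool:
    """The robot gazes at the partner in states where it is a 'looker'."""
    return state in ("robot_only", "mutual")

if __name__ == "__main__":
    state = "neither"
    for t in range(10):
        state = step(state)
        print(f"t={t}: state={state}, robot looks at partner: {robot_should_look(state)}")
```

A first-order chain like this captures the overall proportion and clustering of mutual gaze but not its duration distribution; a semi-Markov variant with explicit dwell times would be one natural refinement given the paper's finding that mutual-gaze durations were shorter than predicted.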
Pages: 222–227
Copyright: (c) IARIA, 2011
Publication date: February 23, 2011
Published in: ACHI 2011, The Fourth International Conference on Advances in Computer-Human Interactions
ISSN: 2308-4138
ISBN: 978-1-61208-117-5
Location: Gosier, Guadeloupe, France
Dates: February 23–28, 2011