International Journal On Advances in Intelligent Systems, volume 2, numbers 2 and 3, 2009
Multimodal Robot/Human Interaction for Assisted Living
Authors:
Ray Jarvis
Om Gupta
Sutono Effendi
Zhi Li
Keywords: Intelligent robotics, assistive technology, scene analysis, robot navigation, gesture recognition, human/machine interaction.
Abstract:
This paper outlines the framework of a complex system that demonstrates multimodal spatial and transactional intelligence in a robot which autonomously supports aged, frail, or otherwise disabled people in a domestic assistive-technology context. The intention is that the robot be able to navigate a known multi-room environment along optimal, collision-free paths to search for and retrieve requested objects such as spectacles and books. It must also be capable of tracking and following humans, reminding them of times for meals and medication, leading disoriented subjects to their meal place at appropriate times, and even dispensing medication if necessary. The modes of communication with the supported human include speech and gestures (including eye-gaze direction), interpreted within a situational analysis that accommodates recent history, temporal factors, and individual user behavioural models. This paper provides an overview of an ambitious research project in its early stages, describes many components developed to date, and outlines future work.
Pages: 288–302
Copyright: Copyright (c) to authors, 2009. Used with permission.
Publication date: December 1, 2009
Published in: journal
ISSN: 1942-2679