ACHI 2014, The Seventh International Conference on Advances in Computer-Human Interactions


Nonintrusive Multimodal Attention Detection

Authors:
Hugo Jiawei Sun
Michael Xuelin Huang
Grace Ngai
Stephen Chi Fai Chan

Keywords: Affective computing; keystroke dynamics; facial expression; multimodal recognition; attention detection

Abstract:
With the increasing deployment of computers in a wide variety of applications, the ability to detect the user's attention, or engagement, is becoming more important as a key piece of contextual information for building effective interactive systems. For instance, a system that is aware of whether the user is attending to it could adapt itself to the user's activities to enhance productivity. The ability to detect attention would also be useful for system analysis in designing and building better systems. However, much previous work on attention detection is either obtrusive or imposes demanding constraints on the context and the participants. In addition, most approaches rely on unimodal signals, which are often limited in availability and stability. This paper addresses these two major limitations through a nonintrusive multimodal solution, which allows participants to work naturally without interference. The solution makes use of common off-the-shelf items that could reasonably be expected in any computing environment and does not rely on expensive, tailor-made equipment. Using a three-class attention state setting, it achieves average accuracy rates of 59.63% to 77.81%; the best result, 77.81% on a general searching task, represents an 11.9% improvement over the baseline. We also analyze and discuss the contributions of individual features to the different models.

Pages: 192 to 199

Copyright: Copyright (c) IARIA, 2014

Publication date: March 23, 2014

Published in: conference

ISSN: 2308-4138

ISBN: 978-1-61208-325-4

Location: Barcelona, Spain

Dates: from March 23, 2014 to March 27, 2014