ACHI 2025, The Eighteenth International Conference on Advances in Computer-Human Interactions
CoMeSy: Multimodal Interaction with a Situated Cobot for Collaborative Tasks
Authors: Sven Milde, Alexander Jost, Rainer Blum, Jan-Torsten Milde, Marius Schultheis, Johannes Weyel, Tobias Müller, Thies Beinke, Niklas Schreiner, Julian Heumüller, Dennis Möller, Frank Hartman
Keywords: Multimodal interaction; Human-robot interaction (HRI); Collaborative robotics; Cobot as an intelligent assistant
Abstract:
The CoMeSy project is developing a system for multimodal interaction between humans and cobots, in which the cobot acts as an intelligent assistant. The system accepts speech and gestures as input and responds with speech, sounds, actions, and visual feedback. A key challenge is dynamically creating action plans based on human input, world knowledge, and visual perception. The system integrates several technologies, including speech recognition and synthesis, image processing, object detection, hand tracking, and acoustic feedback. Currently in development, the project aims to address intelligent communication, situational understanding, dynamic planning, reactive behavior, and robust handling of interruptions, with empirical evaluation planned.
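The pipeline the abstract describes — fusing a spoken command with a pointing gesture, then expanding the result into an action plan grounded in world knowledge — can be illustrated with a minimal sketch. All component names, data structures, and the object-to-location world model below are hypothetical illustrations; the abstract does not specify the actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the multimodal flow from the abstract: speech and
# gesture inputs are fused into one command, which is expanded into an action
# plan using world knowledge. Names are illustrative, not from CoMeSy itself.

@dataclass
class SpeechInput:
    text: str  # e.g. the output of a speech recognizer

@dataclass
class GestureInput:
    target: str  # e.g. an object resolved via hand tracking + object detection

@dataclass
class ActionPlan:
    steps: list = field(default_factory=list)

def fuse(speech: SpeechInput, gesture: GestureInput) -> dict:
    """Combine a deictic utterance ("pick that up") with the pointed-at object."""
    return {"verb": speech.text.split()[0], "object": gesture.target}

def plan(command: dict, world: dict) -> ActionPlan:
    """Expand a fused command into cobot actions, using world knowledge
    (here a toy object-to-location map) to ground the target object."""
    location = world[command["object"]]
    return ActionPlan(steps=[
        f"move_to({location})",
        f"grasp({command['object']})",
        "confirm_via_speech_and_sound()",  # the acoustic/visual feedback channel
    ])

world_model = {"red_block": "table_left"}
command = fuse(SpeechInput("pick that up"), GestureInput("red_block"))
print(plan(command, world_model).steps)  # the grounded plan for the cobot
```

In a real system each stage would be a separate component (recognizer, tracker, planner) exchanging messages, and the planner would also handle interruptions and re-planning, which this sketch omits.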
Pages: 20 to 24
Copyright: Copyright (c) IARIA, 2025
Publication date: May 18, 2025
Published in: ACHI 2025 conference proceedings
ISSN: 2308-4138
ISBN: 978-1-68558-268-5
Location: Nice, France
Dates: from May 18, 2025 to May 22, 2025