Authors:
Diego Navarro Tellez
Ezio Malis
Raphael Antoine
Philippe Martinet
Keywords: robotics, autonomous vehicles, vision and scene understanding, volumetric image representation
Abstract:
In the context of smart structure maintenance, the positioning of data measurements (e.g., radar, thermal camera, etc.) is crucial. Most of this data is meant to be collected in close proximity to the structure. In this paper, we address the challenge of accurately localizing a drone within GPS-deprived environments. This issue arises particularly near large structures, where the GPS signal can be significantly distorted or entirely absent. One of the most common solutions to this problem is to use a vision sensor and a Simultaneous Localization and Mapping (SLAM) system to reconstruct the environment and localize the drone. However, existing SLAM approaches may not be robust and precise enough, especially when the cameras lose perspective due to the proximity of the structure. We propose a novel framework that computes a dense map of the environment and exploits it using direct odometry, which excels at precise localization. The main contribution of this paper is the use of dense maps that enable localization in scenarios with a narrow perspective. Experiments in realistic simulated environments demonstrate the system's capability to localize the drone with 16-centimeter accuracy and to outperform existing state-of-the-art approaches.
Pages: 35 to 40
Copyright: Copyright (c) IARIA, 2025
Publication date: March 9, 2025
Published in: ICAS 2025, The Twenty-First International Conference on Autonomic and Autonomous Systems
ISSN: 2308-3913
ISBN: 978-1-68558-241-8
Location: Lisbon, Portugal
Dates: from March 9, 2025 to March 13, 2025