

Sounds Real: Using Hardware Accelerated Real-time Ray-Tracing for Augmenting Location Dependent Audio Samples

Authors:
Alexander Madzar
Linfeng Li
Hymalai Bello
Bo Zhou
Paul Lukowicz

Keywords: audio synthesis; ray-tracing; Unreal Engine; VRWorks audio.

Abstract:
We present a data augmentation technique for generating location-variant audio samples using ray-traced audio in virtual recreations of the real world. Hardware Audio-Based Location-Aware Systems are capable of locating audio sources in relation to mobile devices, a technique relevant to location-based services and person tracking in ubiquitous environments. However, collecting enough data to train the underlying machine learning models reliably is a limitation of this approach. To overcome this problem, we constructed a virtual environment using the audio ray-tracing solution NVIDIA VRWorks Audio in Unreal Engine 4 to simulate a real-world setting. The environmental sounds from the real-world scenario were imported into the virtual environment. With the necessary calibration between the virtual and real data sets, this strategy can augment the training data for Hardware Audio-Based Location-Aware Systems' machine learning models. Our results show that the audio ray-tracing framework can simulate real-world sound in the virtual environment to a certain extent.
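The sketch below illustrates one way the calibration step described in the abstract could look in practice: scaling a ray-traced clip so its level matches a real recording made at the same position before adding it to the training set. It is not code from the paper; the file names, mono WAV format, and simple RMS-gain calibration are illustrative assumptions.

```python
# Hypothetical sketch (not from the paper): calibrating a simulated (ray-traced)
# audio clip against a real recording before using it as augmented training data.
# Assumes mono WAV files at the same sample rate.
import numpy as np
import soundfile as sf

def rms(x: np.ndarray) -> float:
    """Root-mean-square level of a signal."""
    return float(np.sqrt(np.mean(np.square(x))))

def calibrate_to_real(sim: np.ndarray, real: np.ndarray) -> np.ndarray:
    """Scale the simulated clip so its RMS level matches the real recording."""
    gain = rms(real) / max(rms(sim), 1e-12)
    return sim * gain

# Load one real reference recording and one ray-traced clip rendered at the
# corresponding virtual listener position, then write the calibrated clip
# out as an additional training sample.
real_clip, sr = sf.read("real_position_A.wav")
sim_clip, _ = sf.read("unreal_position_A.wav")
sf.write("augmented_position_A.wav", calibrate_to_real(sim_clip, real_clip), sr)
```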

Pages: 14 to 19

Copyright: Copyright (c) IARIA, 2021

Publication date: October 3, 2021

Published in: UBICOMM 2021, The Fifteenth International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies

ISSN: 2308-4278

ISBN: 978-1-61208-886-0

Location: Barcelona, Spain

Dates: from October 3, 2021 to October 7, 2021