BRAININFO 2021, The Sixth International Conference on Neuroscience and Cognitive Brain Information


Decoding Imagined Auditory Pitch Phenomena with an Autoencoder Based Temporal Convolutional Architecture

Authors:
Sean Paulsen
Lloyd May
Michael Casey

Keywords: neuroimaging; neuroscience; auditory cognition; deep learning

Abstract:
Stimulus decoding of functional Magnetic Resonance Imaging (fMRI) data with machine learning models has provided new insights into neural representational spaces and task-related dynamics. However, the scarcity of labelled (task-related) fMRI data is a persistent obstacle, resulting in model underfitting and poor generalization. In this work, we mitigated data poverty by extending a recent pattern-encoding strategy from the visual memory domain to our own domain of auditory pitch tasks, which to our knowledge had not previously been done. Specifically, extracting preliminary information about participants' neural activation dynamics from the unlabelled fMRI data improved downstream classifier performance when decoding heard and imagined pitch. Our results demonstrate the benefits of leveraging unlabelled fMRI data against data poverty for decoding pitch-based tasks, and yield novel, significant evidence for both separate and overlapping pathways of heard and imagined pitch processing, deepening our understanding of auditory cognitive neuroscience.
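The two-stage strategy the abstract describes can be illustrated with a minimal sketch: first pretrain an autoencoder on unlabelled data to learn a compact encoding of activation patterns, then reuse the learned encoder as a feature extractor for a small labelled decoding task. The sketch below is a deliberately simplified stand-in, not the paper's method: it uses a linear autoencoder in place of the temporal convolutional architecture, synthetic random arrays in place of fMRI volumes, and hypothetical names (`X_unlab`, `X_lab`, `y_lab`) of our own choosing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for fMRI data (illustrative only, not from the paper):
# 200 unlabelled volumes and 40 labelled volumes, 64 voxels each.
X_unlab = rng.standard_normal((200, 64))
X_lab = rng.standard_normal((40, 64))
y_lab = rng.integers(0, 2, size=40)          # e.g., heard vs. imagined pitch

# --- Stage 1: self-supervised pretraining (linear autoencoder, MSE loss) ---
d, h, lr = 64, 16, 1e-2
W_enc = rng.standard_normal((d, h)) * 0.1    # encoder weights
W_dec = rng.standard_normal((h, d)) * 0.1    # decoder weights

def recon_loss(X):
    """Mean squared reconstruction error of the autoencoder on X."""
    return np.mean((X @ W_enc @ W_dec - X) ** 2)

loss_start = recon_loss(X_unlab)
for _ in range(300):
    Z = X_unlab @ W_enc                      # latent codes
    R = Z @ W_dec - X_unlab                  # reconstruction residual
    g_dec = Z.T @ R / len(X_unlab)           # gradient w.r.t. decoder
    g_enc = X_unlab.T @ (R @ W_dec.T) / len(X_unlab)  # gradient w.r.t. encoder
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc
loss_end = recon_loss(X_unlab)               # should be below loss_start

# --- Stage 2: downstream decoding on the small labelled set ---
Z_lab = X_lab @ W_enc                        # features from the pretrained encoder
w = np.zeros(h)                              # logistic-regression weights
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Z_lab @ w)))   # predicted class probabilities
    w -= 0.1 * Z_lab.T @ (p - y_lab) / len(Z_lab)
acc = np.mean(((Z_lab @ w) > 0) == y_lab)    # training accuracy of the decoder
```

The design point this sketch captures is that Stage 1 never touches the labels, so all of the unlabelled data contributes to the encoder; only the small labelled set is spent on the classifier itself, which is the mitigation of data poverty the abstract refers to.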

Pages: 17 to 22

Copyright: Copyright (c) IARIA, 2021

Publication date: July 18, 2021

Published in: conference

ISSN: 2519-8653

ISBN: 978-1-61208-885-3

Location: Nice, France

Dates: from July 18, 2021 to July 22, 2021