DBKDA 2023, The Fifteenth International Conference on Advances in Databases, Knowledge, and Data Applications


Robust Representations in Deep Learning

Authors:
Shu Liu
Qiang Wu

Keywords: deep neural network, LSTM, correntropy loss, robustness

Abstract:
Deep neural networks play an increasingly important role in machine learning and artificial intelligence for handling complicated data. The performance of a deep neural network depends heavily on its architecture and its loss function. While the squared loss is the most common choice for regression analysis, it is known to be sensitive to outliers and adversarial samples. To improve robustness, we introduce the correntropy loss into the implementation of deep neural networks. We further split the neural network architecture into a feature extraction component and a function evaluation component, and design four two-stage algorithms to study which component is more affected by the use of the robust loss. Applications to several real data sets indicate that robust deep neural networks can efficiently generate robust representations of complicated data, and that the two-stage algorithms are consistently more powerful than their one-stage counterparts.

Pages: 27 to 32

Copyright: Copyright (c) IARIA, 2023

Publication date: March 13, 2023

Published in: conference

ISSN: 2308-4332

ISBN: 978-1-68558-056-8

Location: Barcelona, Spain

Dates: from March 13, 2023 to March 17, 2023