FUTURE COMPUTING 2018, The Tenth International Conference on Future Computational Technologies and Applications
Authors:
Shahrzad Mahboubi
Hiroshi Ninomiya
Keywords: Limited-memory quasi-Newton method; Nesterov’s accelerated gradient method; neural networks; training algorithm
Abstract:
This paper describes a novel training algorithm based on the limited-memory quasi-Newton method (LQN) combined with Nesterov's accelerated gradient for faster training of neural networks.
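To make the idea concrete, the sketch below shows one way a limited-memory quasi-Newton (L-BFGS-style) search direction can be combined with a Nesterov-style lookahead gradient: the gradient is evaluated at the lookahead point w + mu*v and fed into the standard two-loop recursion. This is an illustrative assumption based only on the abstract, not the authors' exact algorithm; the function names, hyperparameters (mu, lr, mem), and curvature-pair bookkeeping are all hypothetical.

```python
# Minimal sketch (assumption, not the paper's algorithm): L-BFGS two-loop
# recursion driven by a Nesterov lookahead gradient.
import numpy as np
from collections import deque

def two_loop_recursion(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: approximates H^{-1} @ grad."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # Backward pass over the stored (s, y) pairs, newest first.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:  # scale by gamma = s'y / y'y as the initial inverse Hessian
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
        q *= gamma
    # Forward pass, oldest first, pairing each (s, y) with its alpha.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def train(w, loss_grad, mu=0.9, lr=0.1, mem=10, iters=100):
    """Hypothetical LQN-with-Nesterov loop: the gradient is taken at the
    lookahead point w + mu*v, and the step follows the L-BFGS direction."""
    v = np.zeros_like(w)
    s_hist, y_hist = deque(maxlen=mem), deque(maxlen=mem)
    x_prev, g_prev = None, None
    for _ in range(iters):
        x = w + mu * v                      # Nesterov lookahead point
        g = loss_grad(x)
        if x_prev is not None:
            s, y = x - x_prev, g - g_prev
            if s @ y > 1e-10:               # keep only pairs with positive curvature
                s_hist.append(s)
                y_hist.append(y)
        d = -two_loop_recursion(g, list(s_hist), list(y_hist))
        v = mu * v + lr * d                 # accelerated (momentum-like) update
        w = w + v
        x_prev, g_prev = x, g
    return w
```

As a toy check, `train(np.zeros(2), lambda w: 2.0 * (w - 1.0))` runs the loop on the quadratic loss ||w - 1||^2, whose gradient is 2(w - 1); the iterates should move toward w = (1, 1) under these assumed settings.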
Pages: 1 to 3
Copyright: Copyright (c) IARIA, 2018
Publication date: February 18, 2018
Published in: conference proceedings
ISSN: 2308-3735
ISBN: 978-1-61208-608-8
Location: Barcelona, Spain
Dates: from February 18, 2018 to February 22, 2018