eKNOW 2020, The Twelfth International Conference on Information, Process, and Knowledge Management
Acceleration Technique of Two-Phase Quasi-Newton Method with Momentum for Optimization Problems
Authors:
Sudeera Hasaranga Gunathilaka Mastiyage Don
Shahrzad Mahboubi
Hiroshi Ninomiya
Keywords: neural networks; training algorithm; Two-Phase Quasi-Newton method; Nesterov’s accelerated gradient; momentum terms.
Abstract:
This paper describes a novel acceleration technique for the Two-Phase Quasi-Newton method that incorporates momentum terms into the optimization. The performance of the proposed algorithm is evaluated on an unconstrained optimization problem arising in neural network training. The results show that the proposed algorithm converges much faster than the conventional Two-Phase Quasi-Newton method.
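The general idea behind the abstract, a quasi-Newton search direction combined with a Nesterov-style momentum term, can be sketched generically. The following is a minimal illustration only, not the paper's Two-Phase algorithm: the 2-variable quadratic test problem, the step size `alpha`, and the momentum coefficient `mu` are assumptions made for the demonstration, and a standard BFGS inverse-Hessian update stands in for whatever update the paper actually uses.

```python
# Illustrative sketch: BFGS quasi-Newton direction + Nesterov-style momentum,
# minimizing f(x) = 0.5 x^T A x - b^T x. NOT the paper's Two-Phase method;
# alpha, mu, and the test problem are assumptions for this demo.

A = [[3.0, 1.0], [1.0, 2.0]]   # symmetric positive-definite "Hessian"
b = [1.0, 1.0]                 # minimizer is x* = A^{-1} b = (0.2, 0.4)

def grad(x):
    # gradient of f(x) = 0.5 x^T A x - b^T x, i.e. A x - b
    return [A[0][0]*x[0] + A[0][1]*x[1] - b[0],
            A[1][0]*x[0] + A[1][1]*x[1] - b[1]]

def mat_vec(M, u):
    return [M[0][0]*u[0] + M[0][1]*u[1], M[1][0]*u[0] + M[1][1]*u[1]]

def mat_mul(P, Q):
    return [[sum(P[i][k]*Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def outer(u, z):
    return [[u[i]*z[j] for j in range(2)] for i in range(2)]

def mat_lin(c1, P, c2, Q):     # linear combination c1*P + c2*Q
    return [[c1*P[i][j] + c2*Q[i][j] for j in range(2)] for i in range(2)]

I2 = [[1.0, 0.0], [0.0, 1.0]]
alpha, mu = 0.2, 0.8           # assumed step size and momentum coefficient
H = [row[:] for row in I2]     # inverse-Hessian approximation, start at I
w, v = [0.0, 0.0], [0.0, 0.0]  # iterate and momentum (velocity) term

for _ in range(300):
    look = [w[i] + mu*v[i] for i in range(2)]    # Nesterov look-ahead point
    g = grad(look)                               # gradient at look-ahead
    d = [-x for x in mat_vec(H, g)]              # quasi-Newton direction
    v = [mu*v[i] + alpha*d[i] for i in range(2)] # momentum update
    w_new = [w[i] + v[i] for i in range(2)]
    # standard BFGS update of H from the curvature pair (s, y),
    # measured between the look-ahead point and the new iterate
    s = [w_new[i] - look[i] for i in range(2)]
    y = [grad(w_new)[i] - g[i] for i in range(2)]
    ys = y[0]*s[0] + y[1]*s[1]
    if ys > 1e-12:             # skip the update if curvature is poor
        rho = 1.0 / ys
        V  = mat_lin(1.0, I2, -rho, outer(s, y))   # I - rho s y^T
        Vt = mat_lin(1.0, I2, -rho, outer(y, s))   # I - rho y s^T
        H  = mat_lin(1.0, mat_mul(mat_mul(V, H), Vt), rho, outer(s, s))
    w = w_new

print(w)   # approaches the minimizer (0.2, 0.4)
```

On this convex quadratic the momentum term lets the iterate keep moving through flat regions while the BFGS update steers the search direction toward the Newton direction; the paper's contribution is evaluating such a combination for the Two-Phase variant on neural network training.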
Pages: 59 to 61
Copyright: Copyright (c) IARIA, 2020
Publication date: March 22, 2020
Published in: conference
ISSN: 2308-4375
ISBN: 978-1-61208-765-8
Location: Valencia, Spain
Dates: from November 21, 2020 to November 25, 2020