New optimization algorithms for neural network training using operator splitting techniques

Abstract

We present a new class of optimization algorithms adapted for neural network training. These algorithms are based on a sequential operator splitting technique applied to the associated dynamical systems. Furthermore, we investigate through numerical simulations the empirical rate of convergence of these iterative schemes toward a local minimum of the loss function, for suitable choices of the underlying hyperparameters. We validate the convergence of these optimizers using accuracy and loss results on the MNIST, Fashion-MNIST and CIFAR-10 classification datasets.
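To illustrate the general idea behind such schemes (not the authors' specific algorithm), the sketch below applies a Lie–Trotter sequential splitting to the gradient flow w' = -∇f(w) of a toy quadratic objective f = g + h: each iteration integrates the sub-flow of -∇g, then the sub-flow of -∇h, using an explicit Euler step for each. The split f = g + h, the step size, and the iteration count are hypothetical choices for illustration.

```python
import numpy as np

A = np.array([1.0, -2.0])  # hypothetical target point

def grad_g(w):
    # g(w) = 0.5 * ||w - A||^2
    return w - A

def grad_h(w):
    # h(w) = 0.05 * ||w||^2  (a small quadratic regularizer)
    return 0.1 * w

def lie_trotter_step(w, eta):
    # Sub-step 1: explicit Euler step on the sub-flow w' = -grad_g(w)
    w = w - eta * grad_g(w)
    # Sub-step 2: explicit Euler step on the sub-flow w' = -grad_h(w)
    w = w - eta * grad_h(w)
    return w

w = np.zeros(2)
for _ in range(200):
    w = lie_trotter_step(w, eta=0.1)

# The exact minimizer of g + h solves (w - A) + 0.1 w = 0, i.e. w = A / 1.1.
# Sequential splitting converges to a nearby point, with an O(eta) bias.
print(w)
```

In a neural network setting, g and h would instead come from a decomposition of the loss (or of the optimizer's vector field), and each sub-step would use minibatch gradients.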

Authors

Cristian Daniel Alecsa
Department of Mathematics, Babes-Bolyai University, Cluj-Napoca, Romania
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy, Cluj-Napoca, Romania

Titus Pinta
Department of Mathematics, Babes-Bolyai University, Cluj-Napoca, Romania

Imre Boros
Department of Mathematics, Babes-Bolyai University, Cluj-Napoca, Romania
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy, Cluj-Napoca, Romania

Keywords

unconstrained optimization problems; operator splitting; neural network training

Paper coordinates

Cristian-Daniel Alecsa, Titus Pinta, Imre Boros, New optimization algorithms for neural network training using operator splitting techniques, https://arxiv.org/pdf/1904.12952.pdf

