A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem

Abstract

We investigate an algorithm of gradient type with a backward inertial step in connection with the minimization of a nonconvex differentiable function. We show that the generated sequences converge to a critical point of the objective function if a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we provide convergence rates, formulated in terms of the Łojasiewicz exponent, for the generated sequences and the objective function values. Finally, numerical experiments are presented that compare our scheme with algorithms well known in the literature.
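To illustrate the general idea of an inertial gradient method on a nonconvex objective, the sketch below runs a generic heavy-ball-style iteration, x_{n+1} = x_n + β(x_n − x_{n-1}) − s∇g(x_n), on a simple nonconvex test function. This is only an illustrative prototype under assumed constant parameters `step` and `beta`; it is not the paper's exact backward-inertial scheme, whose step and inertial coefficients are specified in the article itself.

```python
import numpy as np

def grad(x):
    # Gradient of the nonconvex test function g(x, y) = (x^2 - 1)^2 + y^2,
    # which has critical points at x in {-1, 0, 1}, y = 0.
    return np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

def inertial_gradient(x0, step=0.05, beta=0.3, iters=500):
    # Generic inertial gradient iteration (illustrative only):
    #   x_{n+1} = x_n + beta * (x_n - x_{n-1}) - step * grad(x_n)
    x_prev = np.array(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        x_next = x + beta * (x - x_prev) - step * grad(x)
        x_prev, x = x, x_next
    return x

x_star = inertial_gradient([0.5, 1.0])
print(x_star)  # converges to a critical point of g, here near (1, 0)
```

For this choice of `step` and `beta` the linearized iteration contracts near the minimizer (1, 0), matching the kind of convergence to a critical point that the paper establishes under the Kurdyka-Łojasiewicz property.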

Authors

Cristian Daniel Alecsa
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy
Department of Mathematics, Babes-Bolyai University, Cluj-Napoca, Romania

Szilárd Csaba László
Technical University of Cluj-Napoca, Romania

Adrian Viorel
Technical University of Cluj-Napoca, Romania

Keywords

Inertial algorithm; Nonconvex optimization; Kurdyka-Łojasiewicz inequality; Convergence rate

Paper coordinates

Cristian Daniel Alecsa, Szilárd Csaba László, Adrian Viorel, A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem, Numer. Algor., 84 (2020), pp. 485–512.
doi: 10.1007/s11075-019-00765-z

