A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem

Abstract

We investigate an algorithm of gradient type with a backward inertial step in connection with the minimization of a nonconvex differentiable function. We show that the generated sequences converge to a critical point of the objective function provided that a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we provide convergence rates for the generated sequences and the objective function values, formulated in terms of the Łojasiewicz exponent. Finally, numerical experiments are presented to compare our numerical scheme with some algorithms well known in the literature.
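For intuition about the structure of such methods, the sketch below implements a generic inertial gradient iteration in Python: each step takes a gradient step at a point extrapolated from the two most recent iterates. This is a minimal illustration under our own assumptions, not the paper's exact backward-inertial update or step-size policy; the name `inertial_gradient` and the constant parameters `step` and `beta` are placeholders chosen here, and the precise scheme is given in the paper.

```python
import numpy as np

def inertial_gradient(grad, x0, step=1e-3, beta=0.4, n_iter=10000, tol=1e-8):
    """Generic inertial gradient iteration (illustrative sketch only):
        y_n     = x_n + beta * (x_n - x_{n-1})   # inertial extrapolation
        x_{n+1} = y_n - step * grad(y_n)         # gradient step at y_n
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(n_iter):
        y = x + beta * (x - x_prev)            # extrapolated point
        x_prev, x = x, y - step * grad(y)      # gradient step at y
        if np.linalg.norm(x - x_prev) < tol:   # stop once iterates stagnate
            break
    return x

# Example: the nonconvex double well f(x) = x**4/4 - x**2/2 has gradient
# x**3 - x and critical points -1, 0 and 1; started at 2.0, the iteration
# settles on the critical point 1.
if __name__ == "__main__":
    print(inertial_gradient(lambda x: x**3 - x, x0=np.array([2.0]), step=0.1))
```

Whether such iterates converge to a critical point, and at what rate, is precisely what the Kurdyka-Łojasiewicz property of (a regularization of) the objective controls in the paper's analysis.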

Authors

Cristian Daniel Alecsa
Tiberiu Popoviciu Institute of Numerical Analysis, Romanian Academy
Department of Mathematics, Babes-Bolyai University, Cluj-Napoca, Romania

Szilárd Csaba László
Technical University of Cluj-Napoca, Romania

Adrian Viorel
Technical University of Cluj-Napoca, Romania

Keywords

Optimization; Unconstrained optimization problems; Inertial algorithm; Nonconvex optimization; Kurdyka-Łojasiewicz inequality; Convergence rate; Discrete Lyapunov energy functions.

Paper coordinates

Cristian Daniel Alecsa, Szilárd Csaba László, Adrian Viorel, A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem, Numer. Algor., 84 (2020), pp. 485–512.
doi: 10.1007/s11075-019-00765-z

PDF

About this paper

Print ISSN: 1017-1398
Online ISSN: 1572-9265

References

[1] H. Attouch, J. Bolte, On the convergence of the proximal algorithm for nonsmooth functions involving analytic features, Mathematical Programming 116(1-2) Series B, 5-16, 2009

[2] H. Attouch, J. Bolte, P. Redont, A. Soubeyran, Proximal alternating minimization and projection methods for nonconvex problems: an approach based on the Kurdyka-Łojasiewicz inequality, Mathematics of Operations Research 35(2), 438-457, 2010

[3] H. Attouch, J. Bolte, B.F. Svaiter, Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods, Mathematical Programming 137(1-2) Series A, 91-129, 2013

[4] H. Attouch, Z. Chbani, J. Peypouquet, P. Redont, Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity, Mathematical Programming 168(1-2) Series B, 123-175, 2018

[5] P. Bégout, J. Bolte, M.A. Jendoubi, On damped second-order gradient systems, Journal of Differential Equations 259, 3115-3143, 2015

[6] J. Bolte, S. Sabach, M. Teboulle, Proximal alternating linearized minimization for nonconvex and nonsmooth problems, Mathematical Programming 146(1-2) Series A, 459-494, 2014

[7] J. Bolte, A. Daniilidis, A. Lewis, The Łojasiewicz inequality for nonsmooth subanalytic functions with applications to subgradient dynamical systems, SIAM Journal on Optimization 17(4), 1205-1223, 2006

[8] J. Bolte, A. Daniilidis, A. Lewis, M. Shiota, Clarke subgradients of stratifiable functions, SIAM Journal on Optimization 18(2), 556-572, 2007

[9] J. Bolte, A. Daniilidis, O. Ley, L. Mazet, Characterizations of Łojasiewicz inequalities: subgradient flows, talweg, convexity, Transactions of the American Mathematical Society 362(6), 3319-3363, 2010

[10] R.I. Boț, E.R. Csetnek, S.C. László, Approaching nonsmooth nonconvex minimization through second-order proximal-gradient dynamical systems, Journal of Evolution Equations 18(3), 1291-1318, 2018

[11] R.I. Boț, E.R. Csetnek, S.C. László, An inertial forward-backward algorithm for minimizing the sum of two non-convex functions, EURO Journal on Computational Optimization 4(1), 3-25, 2016

[12] R.I. Boț, E.R. Csetnek, S.C. László, A second order dynamical approach with variable damping to nonconvex smooth minimization, Applicable Analysis (2018), https://doi.org/10.1080/00036811.2018.1495330

[13] P. Frankel, G. Garrigos, J. Peypouquet, Splitting Methods with Variable Metric for Kurdyka-Łojasiewicz Functions and General Convergence Rates, Journal of Optimization Theory and Applications 165(3), 874-900, 2015

[14] K. Kurdyka, On gradients of functions definable in o-minimal structures, Annales de l'Institut Fourier (Grenoble) 48(3), 769-783, 1998

[15] S.C. László, Convergence rates for an inertial algorithm of gradient type associated to a smooth nonconvex minimization, https://arxiv.org/abs/1807.00387

[16] G. Li, T.K. Pong, Calculus of the Exponent of Kurdyka-Łojasiewicz Inequality and Its Applications to Linear Convergence of First-Order Methods, Foundations of Computational Mathematics, 1-34, 2018

[17] S. Łojasiewicz, Une propriété topologique des sous-ensembles analytiques réels, Les Équations aux Dérivées Partielles, Éditions du Centre National de la Recherche Scientifique, Paris, 87-89, 1963

[18] Y.E. Nesterov, A method for solving the convex programming problem with convergence rate O(1/k²), (Russian) Dokl. Akad. Nauk SSSR 269(3), 543-547, 1983

[19] Y. Nesterov, Introductory lectures on convex optimization: a basic course, Kluwer Academic Publishers, Dordrecht, 2004

[20] B.T. Polyak, Some methods of speeding up the convergence of iteration methods, USSR Computational Mathematics and Mathematical Physics 4(5), 1-17, 1964

[21] R.T. Rockafellar, R.J.-B. Wets, Variational Analysis, Fundamental Principles of Mathematical Sciences 317, Springer-Verlag, Berlin, 1998

[22] W. Su, S. Boyd, E.J. Candes, A differential equation for modeling Nesterov's accelerated gradient method: theory and insights, Journal of Machine Learning Research 17, 1-43, 2016
