A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: Superlinear convergence to nonisolated local minima

Masoud Ahookhosh, Andreas Themelis, Panagiotis Patrinos

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

We introduce Bella, a locally superlinearly convergent Bregman forward-backward splitting method for minimizing the sum of two nonconvex functions, one of which satisfies a relative smoothness condition while the other is possibly nonsmooth. A key tool of our methodology is the Bregman forward-backward envelope (BFBE), an exact and continuous penalty function with favorable first- and second-order properties that enjoys a nonlinear error bound when the objective function satisfies a Łojasiewicz-type property. The proposed algorithm performs a linesearch over the BFBE along user-defined update directions; it converges subsequentially to stationary points, and globally under the Kurdyka-Łojasiewicz condition. Moreover, when the update directions are superlinear in the sense of Facchinei and Pang [Finite-Dimensional Variational Inequalities and Complementarity Problems, Volume I, Springer, New York, 2003], owing to the given nonlinear error bound the unit stepsize is eventually always accepted and the algorithm attains superlinear convergence rates even when the limit point is a nonisolated minimum.
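For orientation only, the sketch below shows a single Bregman forward-backward step instantiated with the Euclidean kernel h(x) = ½‖x‖², for which the Bregman distance reduces to ½‖w − x‖² and the step becomes the classical proximal-gradient update on an ℓ1-regularized least-squares toy problem. It is not the paper's Bella algorithm: the BFBE linesearch and the user-defined update directions are omitted, and all names and data (soft_threshold, forward_backward_step, the random A and b) are illustrative assumptions, not code from the authors.

```python
# Minimal sketch: one forward-backward (proximal-gradient) step, i.e. the
# Bregman forward-backward step with the Euclidean kernel h(x) = 0.5*||x||^2.
# Illustrative only; the BFBE linesearch of the paper is not implemented here.
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau*||.||_1 (backward step for g = lam*||.||_1)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def forward_backward_step(x, grad_f, lam, gamma):
    """Forward (gradient) step on the smooth part f, then backward (prox)
    step on g = lam*||.||_1, with stepsize gamma."""
    return soft_threshold(x - gamma * grad_f(x), gamma * lam)

# Toy problem: minimize 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # stepsize 1/L for the smooth part

x = np.zeros(50)
for _ in range(500):
    x = forward_backward_step(x, grad_f, lam, gamma)

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```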

Original language: English
Pages (from-to): 653-685
Number of pages: 33
Journal: SIAM Journal on Optimization
Volume: 31
Issue number: 1
Publication status: Published - Feb 2021

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Applied Mathematics
