TY - JOUR
T1 - An interior proximal gradient method for nonconvex optimization
AU - De Marchi, Alberto
AU - Themelis, Andreas
N1 - Publisher Copyright:
© The author(s), 2024.
PY - 2024
Y1 - 2024
AB - We consider structured minimization problems subject to smooth inequality constraints and present a flexible algorithm that combines interior point (IP) and proximal gradient schemes. While traditional IP methods cannot cope with nonsmooth objective functions and proximal algorithms cannot handle complicated constraints, their combined use is shown to compensate for their respective shortcomings. We provide a theoretical characterization of the algorithm and its asymptotic properties, deriving convergence results for fully nonconvex problems and thus bridging the gap with previous works that successfully addressed the convex case. Our interior proximal gradient algorithm benefits from warm starting, generates strictly feasible iterates with decreasing objective value, and returns, after finitely many iterations, a primal-dual pair approximately satisfying suitable optimality conditions. As a byproduct of our analysis of proximal gradient iterations, we demonstrate that a slight refinement of traditional backtracking techniques removes the need for an upper bound on the stepsize sequence, as required in existing results for the nonconvex setting.
KW - interior point methods
KW - locally Lipschitz gradient
KW - nonsmooth nonconvex optimization
KW - proximal algorithms
UR - http://www.scopus.com/inward/record.url?scp=85198117863&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85198117863&partnerID=8YFLogxK
U2 - 10.5802/ojmo.30
DO - 10.5802/ojmo.30
M3 - Article
AN - SCOPUS:85198117863
SN - 2777-5860
VL - 5
JO - Open Journal of Mathematical Optimization
JF - Open Journal of Mathematical Optimization
ER -