TY - JOUR
T1 - SuperMann
T2 - A Superlinearly Convergent Algorithm for Finding Fixed Points of Nonexpansive Operators
AU - Themelis, Andreas
AU - Patrinos, Panagiotis
N1 - Funding Information:
Manuscript received March 7, 2018; accepted January 27, 2019. Date of publication March 27, 2019; date of current version December 3, 2019. This work was supported in part by FWO research projects G086518N and G086318N; in part by the Fonds de la Recherche Scientifique - FNRS and the Fonds Wetenschappelijk Onderzoek - Vlaanderen under EOS Project 30468160; and in part by KU Leuven Internal Funding C14/18/068. Recommended by Associate Editor Y. Tan. (Corresponding author: Andreas Themelis.) A. Themelis is with ESAT-STADIUS, Katholieke Universiteit Leuven, Leuven 3000, Belgium (e-mail: andreas.themelis@kuleuven.be).
Publisher Copyright:
© 1963-2012 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - Operator splitting techniques have recently gained popularity in convex optimization problems arising in various control fields. Being fixed-point iterations of nonexpansive (NE) operators, such methods suffer from many well-known downsides, including high sensitivity to ill conditioning and parameter selection, and consequently low accuracy and robustness. As a universal solution we propose SuperMann, a Newton-type algorithm for finding fixed points of NE operators. It generalizes the classical Krasnosel'skii-Mann scheme, enjoys its favorable global convergence properties, and requires exactly the same oracle. It is based on a novel separating hyperplane projection tailored to nonexpansive mappings, which makes it possible to include steps along any direction. In particular, when the directions satisfy a Dennis-Moré condition we show that SuperMann converges superlinearly under mild assumptions which, surprisingly, do not entail nonsingularity of the Jacobian at the solution but merely metric subregularity. As a result, SuperMann enhances and robustifies all operator splitting schemes for structured convex optimization, overcoming their well-known sensitivity to ill conditioning.
AB - Operator splitting techniques have recently gained popularity in convex optimization problems arising in various control fields. Being fixed-point iterations of nonexpansive (NE) operators, such methods suffer from many well-known downsides, including high sensitivity to ill conditioning and parameter selection, and consequently low accuracy and robustness. As a universal solution we propose SuperMann, a Newton-type algorithm for finding fixed points of NE operators. It generalizes the classical Krasnosel'skii-Mann scheme, enjoys its favorable global convergence properties, and requires exactly the same oracle. It is based on a novel separating hyperplane projection tailored to nonexpansive mappings, which makes it possible to include steps along any direction. In particular, when the directions satisfy a Dennis-Moré condition we show that SuperMann converges superlinearly under mild assumptions which, surprisingly, do not entail nonsingularity of the Jacobian at the solution but merely metric subregularity. As a result, SuperMann enhances and robustifies all operator splitting schemes for structured convex optimization, overcoming their well-known sensitivity to ill conditioning.
UR - http://www.scopus.com/inward/record.url?scp=85077393207&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077393207&partnerID=8YFLogxK
U2 - 10.1109/TAC.2019.2906393
DO - 10.1109/TAC.2019.2906393
M3 - Article
AN - SCOPUS:85077393207
SN - 0018-9286
VL - 64
SP - 4875
EP - 4890
JO - IEEE Transactions on Automatic Control
JF - IEEE Transactions on Automatic Control
IS - 12
M1 - 8675506
ER -