Perturbation resilience of proximal gradient algorithm for composite objectives
Authors
Yanni Guo
- College of Science, Civil Aviation University of China, Tianjin, 300300, P. R. China.
Wei Cui
- College of Science, Civil Aviation University of China, Tianjin, 300300, P. R. China.
Yansha Guo
- School of Information Technology Engineering, Tianjin University of Technology and Education, Tianjin, 300222, P. R. China.
Abstract
In this paper, we study the perturbation resilience of a proximal gradient algorithm in the general Hilbert space setting. Under the assumption that the error sequence is summable, we prove that the iterative sequence converges weakly to a solution of the composite optimization problem. We also establish the bounded perturbation resilience of this iterative method and apply it to the lasso problem.
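The perturbed iteration described in the abstract can be illustrated with a short sketch: a proximal gradient step for the lasso (gradient step on the smooth least-squares term, then soft thresholding for the ℓ1 term), with a summable error sequence added at each iteration. This is a minimal illustration of the general idea, not the authors' exact scheme; the perturbation `e_k` here is an assumed example of a summable sequence (scaled by 1/k²).

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def perturbed_proximal_gradient(A, b, lam, n_iter=500, seed=0):
    """Proximal gradient iteration for the lasso
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    with a summable perturbation e_k injected into each gradient step."""
    rng = np.random.default_rng(seed)
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    t = 1.0 / L                        # constant step size
    x = np.zeros(A.shape[1])
    for k in range(1, n_iter + 1):
        grad = A.T @ (A @ x - b)
        e_k = rng.standard_normal(x.size) / k**2   # sum_k ||e_k|| < infinity
        x = soft_threshold(x - t * (grad + e_k), t * lam)
    return x
```

Because the perturbations are summable, the iterates still approach a lasso solution; with a non-summable error sequence this convergence guarantee is lost.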
Share and Cite
ISRP Style
Yanni Guo, Wei Cui, Yansha Guo, Perturbation resilience of proximal gradient algorithm for composite objectives, Journal of Nonlinear Sciences and Applications, 10 (2017), no. 10, 5566--5575
AMA Style
Guo Yanni, Cui Wei, Guo Yansha, Perturbation resilience of proximal gradient algorithm for composite objectives. J. Nonlinear Sci. Appl. 2017; 10(10):5566--5575
Chicago/Turabian Style
Guo, Yanni, Wei Cui, and Yansha Guo. "Perturbation resilience of proximal gradient algorithm for composite objectives." Journal of Nonlinear Sciences and Applications, 10, no. 10 (2017): 5566--5575
Keywords
- Perturbation resilience
- proximal gradient algorithm
- composite optimization
- lasso problem
References
-
[1]
F. Alvarez, H. Attouch, An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping, Wellposedness in optimization and related topics (Gargnano, 1999), Set-Valued Anal., 9 (2001), 3–11.
-
[2]
J. B. Baillon, G. Haddad, Some properties of angle-bounded and n-cyclically monotone operators, (French) Israel J. Math., 26 (1977), 137–150.
-
[3]
H. H. Bauschke, P. L. Combettes, Convex analysis and monotone operator theory in Hilbert spaces, With a foreword by Hédy Attouch, CMS Books in Mathematics/Ouvrages de Mathématiques de la SMC, Springer, New York (2011).
-
[4]
Y. Censor, R. Davidi, G. T. Herman, Perturbation resilience and superiorization of iterative algorithms, Inverse Problems, 26 (2010), 12 pages.
-
[5]
Y. Censor, R. Davidi, G. T. Herman, R. W. Schulte, L. Tetruashvili, Projected subgradient minimization versus superiorization, J. Optim. Theory Appl., 160 (2014), 730–747.
-
[6]
Y. Censor, A. J. Zaslavski, Convergence and perturbation resilience of dynamic string-averaging projection methods, Comput. Optim. Appl., 54 (2013), 65–76.
-
[7]
Y. C. Cheng, On the gradient-projection method for solving the nonsymmetric linear complementarity problem, J. Optim. Theory Appl., 43 (1984), 527–541.
-
[8]
R. Davidi, Y. Censor, R. W. Schulte, S. Geneser, L. Xing, Feasibility-seeking and superiorization algorithms applied to inverse treatment planning in radiation therapy, Infinite products of operators and their applications, Contemp. Math., Israel Math. Conf. Proc., Amer. Math. Soc., Providence, RI, 636 (2015), 83–92.
-
[9]
E. S. Helou, M. V. W. Zibetti, E. X. Miqueles, Superiorization of incremental optimization algorithms for statistical tomographic image reconstruction, Inverse Problems, 33 (2017), 26 pages.
-
[10]
W.-M. Jin, Y. Censor, M. Jiang, Bounded perturbation resilience of projected scaled gradient methods, Comput. Optim. Appl., 63 (2016), 365–392.
-
[11]
A. Kaplan, R. Tichatschke, On inexact generalized proximal methods with a weakened error tolerance criterion, Optimization, 63 (2004), 3–17.
-
[12]
P.-L. Lions, B. Mercier, Splitting algorithms for the sum of two nonlinear operators, SIAM J. Numer. Anal., 16 (1979), 964–979.
-
[13]
Z.-Q. Luo, P. Tseng, Error bounds and convergence analysis of feasible descent methods: a general approach, Degeneracy in optimization problems, Ann. Oper. Res., 46/47 (1993), 157–178.
-
[14]
J. J. Moreau, Proximity and duality in a Hilbert space, (French) Bull. Soc. Math. France, 93 (1965), 273–299.
-
[15]
Y. Nesterov, Gradient methods for minimizing composite objective function, CORE Discussion Papers, 140 (2007), 125–161.
-
[16]
T. Nikazad, R. Davidi, G. T. Herman, Accelerated perturbation-resilient block-iterative projection methods with application to image reconstruction, Inverse Problems, 28 (2012), 19 pages.
-
[17]
G. B. Passty, Ergodic convergence to a zero of the sum of monotone operators in Hilbert space, J. Math. Anal. Appl., 72 (1979), 383–390.
-
[18]
B. T. Polyak, Introduction to optimization, Translated from the Russian. With a foreword by Dimitri P. Bertsekas, Translations Series in Mathematics and Engineering, Optimization Software, Inc., Publications Division, New York (1987).
-
[19]
S. A. Santos, R. C. M. Silva, An inexact and nonmonotone proximal method for smooth unconstrained minimization, J. Comput. Appl. Math., 269 (2014), 86–100.
-
[20]
M. J. Schrapp, G. T. Herman, Data fusion in X-ray computed tomography using a superiorization approach, Rev. Sci. Instrum., 85 (2014), 9 pages.
-
[21]
H.-K. Xu, Properties and iterative methods for the lasso and its variants, Chin. Ann. Math. Ser. B, 35 (2014), 501–518.
-
[22]
H.-K. Xu, Bounded perturbation resilience and superiorization techniques for the projected scaled gradient method, Inverse Problems, 33 (2017), 19 pages.
-
[23]
Y.-H. Yao, N. Shahzad, Strong convergence of a proximal point algorithm with general errors, Optim. Lett., 6 (2012), 621–628.
-
[24]
A. J. Zaslavski, Convergence of a proximal point method in the presence of computational errors in Hilbert spaces, SIAM J. Optim., 20 (2010), 2413–2421.