THE REGULARIZED OPERATOR EXTRAPOLATION ALGORITHM

  • V. V. Semenov, Faculty of Computer Science and Cybernetics, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
  • O. S. Kharkov, Faculty of Computer Science and Cybernetics, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
Keywords: variational inequality, monotone operator, operator extrapolation algorithm, Halpern method

Abstract

This work is devoted to the study of a new algorithm for solving variational inequalities in Hilbert spaces. The proposed algorithm is a variant of the operator extrapolation method regularized with the Halpern scheme. The algorithm has an advantage over the Korpelevich extragradient method and the method of extrapolation from the past in terms of the amount of computation required per iteration. For variational inequalities with monotone, Lipschitz-continuous operators acting in a Hilbert space, a theorem on strong convergence of the method is proved.
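The abstract's point about per-iteration cost can be illustrated with a minimal sketch: the operator extrapolation step evaluates the operator once per iteration (reusing the previous value), unlike the extragradient method's two evaluations, and Halpern averaging anchors the iterates to the starting point to obtain strong convergence. The exact step-size rules and anchor weights below (`lam`, `alpha_k = 1/(k+1)`) are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def reg_operator_extrapolation(A, proj, x0, lam=0.1, n_iters=200):
    """Sketch of a Halpern-regularized operator extrapolation iteration.

    A    : monotone, Lipschitz-continuous operator (callable)
    proj : metric projection onto the feasible set C (callable)
    x0   : anchor / starting point of the Halpern scheme
    lam  : step size (illustratively, lam < 1/(2L) for an L-Lipschitz A)
    """
    x = x0.copy()
    Ax_prev = A(x)
    for k in range(1, n_iters + 1):
        alpha = 1.0 / (k + 1)  # Halpern anchor weights: alpha_k -> 0, sum diverges
        Ax = A(x)              # the only fresh operator evaluation this iteration
        # operator extrapolation: 2*A(x_k) - A(x_{k-1}) reuses the stored value
        y = proj(x - lam * (2.0 * Ax - Ax_prev))
        # Halpern regularization pulls the iterate toward the anchor x0,
        # which is what yields strong (not merely weak) convergence
        x = alpha * x0 + (1.0 - alpha) * y
        Ax_prev = Ax
    return x
```

As a sanity check, for the trivially monotone operator `A(x) = x` on the whole space (projection is the identity), the unique solution of the variational inequality is the origin, and the iterates drift from the anchor toward it.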

References

Nagurney A. Network economics: A variational inequality approach. Dordrecht: Kluwer Academic Publishers, 1999. 325 p.

Kinderlehrer D., Stampacchia G. An introduction to variational inequalities and their applications. New York: Academic Press, 1980. Russian transl., Moscow: Mir, 1983. 256 p.

Bakushinskii A. B., Goncharskii A. V. Iterative Methods for Solving Ill-Posed Problems. Moscow: Nauka, 1989. 126 p.

Facchinei F., Pang J.-S. Finite-Dimensional Variational Inequalities and Complementarity Problems. Vol. 2. New York: Springer, 2003. 666 p.

Alber Y., Ryazantseva I. Nonlinear Ill-Posed Problems of Monotone Type. Dordrecht: Springer, 2006. 410 p.

Bauschke H. H., Combettes P. L. Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Berlin-Heidelberg-New York: Springer, 2011. 408 p.

Semenov V. V. Variational inequalities: theory and algorithms. Kyiv: "Kyiv University", 2021. 167 p.

Nemirovski A. Prox-method with rate of convergence O(1/T) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. on Optim. 2004. Vol. 15. P. 229–251.

Wang J.-K., Abernethy J., Levy K. Y. No-Regret Dynamics in the Fenchel Game: A Unified Framework for Algorithmic Convex Optimization. arXiv preprint arXiv:2111.11309. 2021.

Gidel G., Berard H., Vincent P., Lacoste-Julien S. A Variational Inequality Perspective on Generative Adversarial Networks. arXiv preprint arXiv:1802.10551. 2018.

Korpelevich G. M. An extragradient method for finding saddle points and for other problems. Matecon. 1976. Vol. 12. No. 4. P. 747–756.

Nadezhkina N., Takahashi W. Weak convergence theorem by an extragradient method for nonexpansive mappings and monotone mappings. Journal of Optimization Theory and Applications. 2006. Vol. 128. P. 191–201.

Verlan D. A., Semenov V. V., Chabak L. M. A Strongly Convergent Modified Extragradient Method for Variational Inequalities with Non-Lipschitz Operators. Journal of Automation and Information Sciences. 2015. Vol. 47. Issue 7. P. 31–46. https://doi.org/10.1615/JAutomatInfScien.v47.i7.40.

Denisov S. V., Nomirovskii D. A., Rublyov B. V., Semenov V. V. Convergence of Extragradient Algorithm with Monotone Step Size Strategy for Variational Inequalities and Operator Equations. Journal of Automation and Information Sciences. 2019. Vol. 51. Issue 6. P. 12–24. https://doi.org/10.1615/JAutomatInfScien.v51.i6.20.

Tseng P. A modified forward-backward splitting method for maximal monotone mappings. SIAM Journal on Control and Optimization. 2000. Vol. 38. P. 431–446.

Censor Y., Gibali A., Reich S. The subgradient extragradient method for solving variational inequalities in Hilbert space. Journal of Optimization Theory and Applications. 2011. Vol. 148. P. 318–335. https://doi.org/10.1007/s10957-010-9757-3.

Lyashko S. I., Semenov V. V., Voitova T. A. Low-cost modification of Korpelevich’s methods for monotone equilibrium problems. Cybernetics and Systems Analysis. 2011. Vol. 47. Issue 4. P. 631–639. https://doi.org/10.1007/s10559-011-9343-1.

Semenov V. V. Modified Extragradient Method with Bregman Divergence for Variational Inequalities. Journal of Automation and Information Sciences. 2018. Vol. 50. Issue 8. P. 26–37. https://doi.org/10.1615/JAutomatInfScien.v50.i8.30.

Bach F., Levy K. Y. A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise. arXiv preprint arXiv:1902.01637. 2019.

Popov L. D. A modification of the Arrow-Hurwicz method for search of saddle points. Mathematical notes of the Academy of Sciences of the USSR. 1980. Vol. 28. Issue 5. P. 845–848.

Vedel Y. I., Sandrakov G. V., Semenov V. V. An Adaptive Two Stage Proximal Algorithm for Equilibrium Problems in Hadamard Spaces. Cybernetics and Systems Analysis. 2020. Vol. 56. Issue 6. P. 978–989. https://doi.org/10.1007/s10559-020-00318-6.

Vedel Y. I., Denisov S. V., Semenov V. V. An Adaptive Algorithm for the Variational Inequality Over the Set of Solutions of the Equilibrium Problem. Cybernetics and Systems Analysis. 2021. Vol. 57. Issue 1. P. 91–100. https://doi.org/10.1007/s10559-021-00332-2.

Semenov V. V., Denisov S. V., Kravets A. V. Adaptive Two-Stage Bregman Method for Variational Inequalities. Cybernetics and Systems Analysis. 2021. Vol. 57. Issue 6. P. 959–967. https://doi.org/10.1007/s10559-021-00421-2.

Malitsky Yu. V., Semenov V. V. An Extragradient Algorithm for Monotone Variational Inequalities. Cybernetics and Systems Analysis. 2014. Vol. 50. Issue 2. P. 271–277. https://doi.org/10.1007/s10559-014-9614-8.

Chabak L., Semenov V., Vedel Y. A New Non-Euclidean Proximal Method for Equilibrium Problems. In: Chertov O., Mylovanov T., Kondratenko Y., Kacprzyk J., Kreinovich V., Stefanuk V. (eds.) Recent Developments in Data Science and Intelligent Analysis of Information. ICDSIAI 2018. Advances in Intelligent Systems and Computing, vol. 836. Springer, Cham, 2019. P. 50–58. https://doi.org/10.1007/978-3-319-97885-7_6.

Vedel Y. I., Sandrakov G. V., Semenov V. V., Chabak L. M. Convergence of a Two-Stage Proximal Algorithm for the Equilibrium Problem in Hadamard Spaces. Cybernetics and Systems Analysis. 2020. Vol. 56. Issue 5. P. 784–792. https://doi.org/10.1007/s10559-020-00299-6.

Malitsky Y., Tam M. K. A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity. SIAM J. on Optim. 2020. Vol. 30. P. 1451–1472. https://doi.org/10.1137/18M1207260.

Semenov V. V., Denisov S. V., Sandrakov G. V., Kharkov O. S. Convergence of the Operator Extrapolation Method for Variational Inequalities in Banach Spaces. Cybernetics and Systems Analysis. 2022. Vol. 58. Issue 6. P. 740–753. https://doi.org/10.1007/s10559-022-00507-5.

Vedel Y., Semenov V., Denisov S. A Novel Algorithm with Self-adaptive Technique for Solving Variational Inequalities in Banach Spaces. In: Olenev N. N., Evtushenko Y. G., Jaćimović M., Khachay M., Malkova V. (eds.) Advances in Optimization and Applications. OPTIMA 2021. Communications in Computer and Information Science, vol. 1514. Springer, Cham, 2021. P. 50–64.

Iiduka H., Takahashi W. Weak convergence of a projection algorithm for variational inequalities in a Banach space. Journal of Mathematical Analysis and Applications. 2008. Vol. 339. No. 1. P. 668–679. https://doi.org/10.1016/j.jmaa.2007.07.019.

Shehu Y. Single projection algorithm for variational inequalities in Banach spaces with application to contact problem. Acta Math. Sci. 2020. 40. P. 1045–1063. https://doi.org/10.1007/s10473-020-0412-2.

Yang J., Cholamjiak P., Sunthrayuth P. Modified Tseng’s splitting algorithms for the sum of two monotone operators in Banach spaces. AIMS Mathematics. 2021. Vol. 6. Issue 5. P. 4873–4900. https://doi.org/10.3934/math.2021286.

Halpern B. Fixed points of nonexpanding maps. Bull. Amer. Math. Soc. 1967. Vol. 73. P. 957–961.

Mainge P.-E. Strong convergence of projected subgradient methods for nonsmooth and nonstrictly convex minimization. Set-Valued Analysis. 2008. Vol. 16. P. 899–912.

Gorbunov E., Loizou N., Gidel G. Extragradient Method: O(1/K) Last Iterate Convergence for Monotone Variational Inequalities and Connections With Cocoercivity. arXiv preprint arXiv:2110.04261. 2021.

Yoon T., Ryu E. K. Accelerated algorithms for smooth convex-concave minimax problems with O(1/k^2) rate on squared gradient norm. Proceedings of the 38th International Conference on Machine Learning. Proceedings of Machine Learning Research. 2021. Vol. 139. P. 12098–12109.

Published
2023-08-11
How to Cite
Semenov, V., & Kharkov, O. (2023). THE REGULARIZED OPERATOR EXTRAPOLATION ALGORITHM. Journal of Numerical and Applied Mathematics, (1), 15-27. https://doi.org/10.17721/2706-9699.2023.1.02