• A. Yu. Shavlyuk Faculty of Computer Science and Cybernetics, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
  • V. V. Semenov Faculty of Computer Science and Cybernetics, Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
Keywords: convexity, optimization, gradient system, convergence, Lyapunov function


The asymptotic behavior of a gradient system, a continuous analogue of the variant of the gradient method from [16] for the minimization of strongly convex functions, is studied. Using Lyapunov analysis, convergence rate estimates for the gradient system are established.
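The specific gradient system from [16] analyzed in the paper is not reproduced in this abstract, so the following is only an illustrative sketch of the general setting: a forward-Euler simulation of the classical gradient flow x'(t) = -grad f(x(t)) for a strongly convex quadratic, for which the distance to the minimizer decays exponentially. All names and parameters below are assumptions for illustration.

```python
import numpy as np

def gradient_flow(A, b, x0, dt=1e-3, steps=5000):
    """Forward-Euler discretization of the gradient flow x' = -grad f(x)
    for f(x) = 0.5 * x^T A x - b^T x, so grad f(x) = A x - b."""
    x = x0.copy()
    for _ in range(steps):
        x -= dt * (A @ x - b)
    return x

# Symmetric positive definite A makes f strongly convex with modulus
# mu = smallest eigenvalue of A; the unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
x_star = np.linalg.solve(A, b)

x_T = gradient_flow(A, b, x0=np.zeros(2))

# For mu-strongly convex f, ||x(t) - x*|| <= exp(-mu t) ||x(0) - x*||,
# which a Lyapunov function such as V(x) = ||x - x*||^2 makes precise.
print(np.linalg.norm(x_T - x_star))
```

Here t = dt * steps = 5, so the error is roughly exp(-mu t) times the initial distance, far below the starting error of about 0.45.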


Polyak B. T. Gradient methods for the minimization of functionals. USSR Computational Mathematics and Mathematical Physics. 1963. Vol. 3. Iss. 4. P. 864–878.

Alber S. I., Alber Ya. I. The method of differential descent for the solution of multi-dimensional variational problems. Dokl. Akad. Nauk SSSR. 1966. Vol. 171. No. 6. P. 1247–1250.

Alber S. I., Alber Ya. I. A method of differential descent for solving non-linear systems. USSR Computational Mathematics and Mathematical Physics. 1967. Vol. 7. Iss. 1. P. 15–40.

Goudou X., Munier J. The gradient and heavy ball with friction dynamical systems: the quasiconvex case. Math. Program. 2009. Vol. 116. P. 173–191.

Polyak B. T., Shcherbakov P. S. Optimisation and asymptotic stability. International Journal of Control. 2018. Vol. 91. No. 11. P. 2404–2410.

Polyak B. T. Some methods of speeding up the convergence of iteration methods. USSR Computational Mathematics and Mathematical Physics. 1964. Vol. 4. Iss. 5. P. 1–17.

Alvarez F. On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces. SIAM Journal on Control and Optimization. 2000. Vol. 38. Iss. 4. P. 1102–1119.

Alvarez F., Attouch H., Bolte J., Redont P. A second-order gradient-like dissipative dynamical system with Hessian-driven damping: Application to optimization and mechanics. Journal de Mathématiques Pures et Appliquées. 2002. Vol. 81. Iss. 8. P. 747–779.

Su W., Boyd S., Candes E. J. A Differential Equation for Modeling Nesterov’s Accelerated Gradient Method: Theory and Insights. Journal of Machine Learning Research. 2016. Vol. 17. P. 1–43.

Nesterov Y. Introductory lectures on convex optimization: A basic course. Boston: Kluwer Academic Publishers, 2004. 254 p.

Attouch H., Fadili J. From the Ravine method to the Nesterov method and vice versa: a dynamical system perspective. arXiv:2201.11643. 2022.

Alber Ya. I. Continuous regularization of linear operator equations in a Hilbert space. Mathematical Notes of the Academy of Sciences of the USSR. 1968. Vol. 4. P. 793–797.

Alber Ya. I. Continuous processes of Newton type. Differ. Uravn. 1971. Vol. 7. No. 11. P. 1931–1945.

Suh J. J., Roh G., Ryu E. K. Continuous-Time Analysis of Accelerated Gradient Methods via Conservation Laws in Dilated Coordinate Systems. Proceedings of the 39th International Conference on Machine Learning. 2022. P. 20640–20667.

Attouch H., Chbani Z., Fadili J., Riahi H. Fast Convergence of Dynamical ADMM via Time Scaling of Damped Inertial Dynamics. J. Optim. Theory Appl. 2022. Vol. 193. P. 704–736.

Fatkhullin I., Polyak B. Optimizing Static Linear Feedback: Gradient Method. arXiv:2004.09875. 2020.

Krasnoselskii M. A., Vainikko G. M., Zabreiko P. P., Rutickii Ya. B., Stetsenko V. Ya. Approximate Solution of Operator Equations. Groningen: Wolters–Noordhoff, 1972. 484 p.