Talk 7: On the Global R-linear Convergence of NAG Method and Beyond
2024/10/21


Speaker: Chenglong Bao (包承龙), Tsinghua University


Title: On the Global R-linear Convergence of NAG Method and Beyond


Abstract: The Nesterov Accelerated Gradient (NAG) method is a widely used extrapolation-based algorithm that accelerates the convergence of gradient descent for convex optimization problems. In this talk, we explore the global R-linear convergence of the NAG method applied to strongly convex functions, under the assumption that the extrapolation coefficient is independent of the strong convexity parameter. Moreover, we provide a mathematical analysis that demonstrates the advantages of restart schemes in the NAG method. Finally, we compare these results with the continuous-time understanding of the NAG method from the perspective of ordinary differential equations.
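For concreteness, the sketch below shows one common form of the NAG iteration in which the extrapolation coefficient beta_k = (t_k - 1)/t_{k+1}, with t_{k+1} = (1 + sqrt(1 + 4 t_k^2))/2, does not involve the strong convexity parameter, together with an optional fixed-interval restart. The function name, step size, restart period, and quadratic test problem are illustrative assumptions, not the speaker's exact scheme.

```python
import numpy as np

def nag_with_restart(grad, x0, step, n_iters, restart_every=None):
    """NAG with the mu-independent extrapolation coefficient
    beta_k = (t_k - 1) / t_{k+1},  t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2.
    `restart_every`, if set, resets the momentum every fixed number of
    iterations; this fixed-interval rule is an illustrative choice only.
    """
    x = x_prev = np.asarray(x0, dtype=float)
    t = 1.0
    for k in range(n_iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        beta = (t - 1.0) / t_next           # independent of the strong convexity parameter
        y = x + beta * (x - x_prev)         # extrapolation step
        x_prev, x = x, y - step * grad(y)   # gradient step at the extrapolated point
        t = t_next
        if restart_every and (k + 1) % restart_every == 0:
            t, x_prev = 1.0, x              # restart: discard accumulated momentum
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x (L = 100, mu = 1)
A = np.diag([1.0, 100.0])
x_final = nag_with_restart(lambda x: A @ x, x0=[1.0, 1.0],
                           step=1.0 / 100.0, n_iters=500, restart_every=50)
print(np.linalg.norm(x_final))  # near 0: the iterates approach the minimizer
```

On ill-conditioned problems such as this quadratic, NAG with a mu-independent coefficient tends to oscillate, and periodically resetting the momentum damps these oscillations; this is the qualitative behavior that the restart analysis in the talk addresses.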