Speaker: Rujun Jiang (Fudan University)
Title: Accelerated Gradient Descent by Concatenation of Stepsize Schedules
Abstract: This talk considers stepsize schedules for gradient descent on smooth convex objectives. Extending the existing literature, we propose a unified approach for constructing stepsize schedules with analytic convergence bounds for an arbitrary number of iterations. Our approach builds new stepsize schedules by concatenating two shorter ones. Using this approach, we introduce two new families of stepsize schedules, achieving a convergence rate of $O(n^{-1.2716})$ with state-of-the-art constants on the objective value and the gradient norm, respectively. Furthermore, our analytically derived stepsize schedules match the numerically computed, globally optimal stepsize schedules.
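To fix the setting for readers: a stepsize schedule here is a finite sequence of dimensionless stepsizes $h_1, \dots, h_n$ driving the updates $x_{k+1} = x_k - (h_k/L)\nabla f(x_k)$ on an $L$-smooth convex objective $f$. The sketch below is a minimal illustration of this setup only; the schedule values and the bare list concatenation are placeholders, since the talk's actual schedules (including the connecting step between the two concatenated halves) are derived analytically in the paper.

```python
import numpy as np

def gradient_descent(grad, x0, L, schedule):
    # Gradient descent with a prescribed stepsize schedule:
    #   x_{k+1} = x_k - (h_k / L) * grad(x_k),
    # where h_k is the k-th dimensionless stepsize and L is the
    # smoothness constant of the objective.
    x = np.asarray(x0, dtype=float)
    for h in schedule:
        x = x - (h / L) * grad(x)
    return x

# Toy L-smooth convex objective: f(x) = 0.5 * x^T A x, with L = lambda_max(A) = 1.
A = np.diag([1.0, 0.1])
grad = lambda x: A @ x

# Placeholder schedules for illustration only -- NOT the talk's analytic values.
# The paper's construction also determines how the two halves are joined;
# plain list concatenation is shown here just to convey the idea.
short_a = [1.5, 2.0, 1.5]
short_b = [1.5, 2.0, 1.5]
long_schedule = short_a + short_b

x = gradient_descent(grad, x0=[1.0, 1.0], L=1.0, schedule=long_schedule)
print("final iterate:", x)
```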