Speaker: 苏文藻 (The Chinese University of Hong Kong)
Title: Universal Gradient Descent Ascent Method for Smooth Minimax Optimization
Abstract: Smooth minimax optimization has attracted much attention over the past decade. Considerable research has focused on algorithms tailored to smooth minimax problems under specific structural conditions, such as convexity of the primal function, concavity of the dual function, or the Polyak-Lojasiewicz (PL) and Kurdyka-Lojasiewicz (KL) conditions. However, verifying these conditions is often challenging in practice. This motivates our pursuit of universal algorithms for smooth minimax problems. In this talk, we present a novel universally applicable single-loop algorithm, the doubly smoothed optimistic gradient descent ascent method (DS-OGDA). With a single set of parameters, DS-OGDA can be applied to convex-concave, nonconvex-concave, convex-nonconcave, nonconvex-KL, and KL-nonconcave scenarios. Moreover, by exploiting structural information, DS-OGDA achieves the optimal or best-known iteration complexity for each of these scenarios.
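As background for the talk, the sketch below illustrates the optimistic gradient descent ascent (OGDA) base update on the toy bilinear problem f(x, y) = x·y, whose saddle point is (0, 0). This is not the DS-OGDA method of the talk: the double-smoothing terms and the universal parameter choices are omitted, and the step size and iteration count here are illustrative assumptions. It shows the "optimism" correction (2·current gradient minus previous gradient) that lets OGDA converge where plain simultaneous gradient descent ascent cycles outward.

```python
def grad(x, y):
    # For f(x, y) = x * y: grad_x f = y, grad_y f = x.
    return y, x

def gda(x0, y0, eta=0.1, iters=1000):
    # Plain simultaneous gradient descent ascent (diverges on f = x * y).
    x, y = x0, y0
    for _ in range(iters):
        gx, gy = grad(x, y)
        x, y = x - eta * gx, y + eta * gy
    return x, y

def ogda(x0, y0, eta=0.1, iters=1000):
    # Optimistic GDA: extrapolate with 2 * current - previous gradient.
    x, y = x0, y0
    gx_prev, gy_prev = grad(x, y)
    for _ in range(iters):
        gx, gy = grad(x, y)
        x = x - eta * (2 * gx - gx_prev)
        y = y + eta * (2 * gy - gy_prev)
        gx_prev, gy_prev = gx, gy
    return x, y

# OGDA spirals in toward the saddle point (0, 0); plain GDA spirals out.
print(ogda(1.0, 1.0))
print(gda(1.0, 1.0))
```

On this bilinear example the plain GDA iterates grow without bound, while the optimistic correction yields linear convergence to the saddle point; DS-OGDA builds on this optimistic step, with additional smoothing, to handle the nonconvex and KL settings described in the abstract.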