Research Fellow (Relative smooth optimization theory)
Job Description
In recent decades, we have witnessed significant progress in the convergence and complexity theory of first-order optimization methods, with the gradient global Lipschitz continuity (GGLC) assumption playing a central role in many classical results. However, a large class of important problems arising in modern optimization and machine learning does not satisfy this assumption. As a result, there remains a substantial gap between the theory and the practical behavior of many widely used algorithms.
This project, led by Dr. Zhang, aims to strengthen the theoretical foundations of relative smooth optimization, an emerging framework developed to go beyond the classical GGLC setting. In particular, the project will study first-order methods under relative smoothness, with a focus on nonconvex problems, more appropriate optimality measures, and new non-Euclidean Lipschitz tools that better capture the underlying problem geometry. The goal is to establish sharper convergence and complexity results, clarify several widely adopted but potentially misleading arguments in the current literature, and develop a new, more reliable and more powerful analysis framework for the relative smooth problem class.
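
For orientation, the following is the standard definition of relative smoothness from the literature, not wording from the posting itself: a differentiable function f is L-smooth relative to a reference function h if

\[ f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D_h(y, x) \quad \text{for all } x, y \in \operatorname{int} \operatorname{dom} h, \]

where \( D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle \) is the Bregman divergence of h. Taking \( h = \tfrac{1}{2}\|\cdot\|^2 \) recovers the classical GGLC condition.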
Job Requirements
Interested applicants are required to possess a PhD by 2026. They should have a good understanding of:
- convergence and complexity analysis for (nonconvex) optimization algorithms
- variational inequalities and duality theory
- stochastic processes and martingale theory
- semi-algebraic and subanalytic geometry
- stochastic approximation methods
- dynamical systems
The applicant should also be experienced in MATLAB and Python programming. In particular, they should be able to adapt PyTorch base code to implement new algorithms, rather than only calling built-in functions (a minimal sketch of what this entails is given below).
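
For illustration only (this sketch is not part of the posting; the class name and update rule are assumptions): implementing a new method in PyTorch typically means subclassing torch.optim.Optimizer and writing the parameter update yourself, instead of instantiating a built-in optimizer such as torch.optim.SGD.

import torch
from torch.optim import Optimizer

class PlainGD(Optimizer):
    # Minimal hand-rolled gradient descent; the per-parameter update
    # below is the piece a new algorithm would replace with its own rule.
    def __init__(self, params, lr=1e-2):
        if lr <= 0.0:
            raise ValueError(f"invalid learning rate: {lr}")
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                p.add_(p.grad, alpha=-group["lr"])  # x <- x - lr * grad f(x)
        return loss

Such a class is used like any built-in optimizer: opt = PlainGD(model.parameters(), lr=0.1), followed by the usual loss.backward(); opt.step(); opt.zero_grad() loop.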
In addition, experience in GPU-based acceleration of large-scale algorithms will be an advantage. Familiarity with implementing or adapting first-order methods on GPU platforms, as well as with handling large-scale matrix-vector computations efficiently, is preferred.
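
As a rough sketch of this kind of GPU usage in PyTorch (sizes and variable names here are arbitrary illustrations, not project requirements):

import torch

# Run on the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

A = torch.randn(8192, 8192, device=device)  # large dense matrix
x = torch.randn(8192, device=device)
y = A @ x  # matrix-vector product executed on the selected device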
