About me
Hi, I am a Ph.D. student in Systems Engineering at Boston University, supervised by Prof. Francesco Orabona. I am a member of the OPTIMAL Lab.
My research interests lie in theoretical machine learning and stochastic optimization. I currently work on understanding and designing optimization methods for machine learning, in particular stochastic gradient descent and its variants, and adaptive gradient methods.
I received my Bachelor's degree in Mathematics and Applied Mathematics from the University of Science and Technology of China.
Publications
On the Last Iterate Convergence of Momentum Methods.
Xiaoyu Li, Mingrui Liu, Francesco Orabona. ALT 2022. Paper

A Second Look at Exponential and Cosine Step Sizes: Simplicity, Convergence, and Performance.
Xiaoyu Li, Zhenxun Zhuang, Francesco Orabona. ICML 2021. Paper Code

A High Probability Analysis of Adaptive SGD with Momentum.
Xiaoyu Li, Francesco Orabona. ICML 2020 Workshop on Beyond First Order Methods in ML Systems. Paper Video

On the Convergence of Stochastic Gradient Descent with Adaptive Stepsizes.
Xiaoyu Li, Francesco Orabona. AISTATS 2019. Paper
Experiences
Work
- Applied Scientist Intern @ Amazon, Remote. June - Aug. 2021
- Research Intern @ Nokia Bell Labs, Murray Hill, NJ. June - Aug. 2019
Academic Service
Reviewer for NeurIPS 2019 & 2020; AISTATS 2020–2022; ICLR 2021; ICML 2020 & 2021