About Myself
I am a Quantitative Researcher at Citadel Securities. Previously, I received my Ph.D. from Stanford ICME, where I was fortunate to be advised by Professor Tengyu Ma. My research interests lie in machine learning theory, in particular Federated Learning, Optimization, and Deep Learning theory. Before Stanford, I graduated from Peking University with B.S. degrees in Computational Mathematics and Computer Science.
Please find my CV and Google Scholar profile here.
Publications
Plex: Towards Reliability using Pretrained Large Model Extensions
with Dustin Tran et al.
arXiv:2207.07411 | bibtex | code | blog

On Principled Local Optimization Methods for Federated Learning
Ph.D. Thesis, Stanford University, 2022

Big-Step-Little-Step: Efficient Gradient Methods for Objectives with Multiple Scales
with Jonathan Kelner, Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant (alphabetical order)
COLT 2022 | bibtex | code

Sharp Bounds for Federated Averaging (Local SGD) and Continuous Perspective
Margalit Glasgow*, Honglin Yuan*, Tengyu Ma (*equal contribution)
AISTATS 2022 | bibtex | code

What Do We Mean by Generalization in Federated Learning?
Honglin Yuan, Warren Morningstar, Lin Ning, Karan Singhal
ICLR 2022 | bibtex | code

A Field Guide to Federated Optimization
with Jianyu Wang et al.
arXiv:2107.06917 | bibtex | code

Federated Composite Optimization
Honglin Yuan, Manzil Zaheer, Sashank Reddi
ICML 2021 | bibtex | talk video at FLOW seminar | slides | poster | code

Federated Accelerated Stochastic Gradient Descent
Honglin Yuan, Tengyu Ma
NeurIPS 2020 (Best Paper in FL-ICML’20 workshop) | bibtex | video (3 min) | poster | code

Global Optimization with Orthogonality Constraints via Stochastic Diffusion on Manifold
Honglin Yuan, Xiaoyi Gu, Rongjie Lai, Zaiwen Wen
Journal of Scientific Computing, 2019 | bibtex | slides