About Me
Hi! I am Zixuan Wang. I am a second-year Ph.D. student in Electrical and Computer Engineering at Princeton University. I am fortunate to be advised by Prof. Jason D. Lee.
My research interests broadly lie in understanding the foundations of LLMs and deep learning, as well as leveraging both theoretical and empirical explorations to advance the frontiers of deep learning and language modeling.
I did my undergraduate studies at the Institute for Interdisciplinary Information Sciences (IIIS), Tsinghua University (known as the "Yao Class").
You can contact me at wangzx (at) princeton (dot) edu. My CV is here!
Selected Publications
Scaling Latent Reasoning via Looped Language Models
Rui-Jie Zhu*, Zixuan Wang*, Kai Hua*, Tianyu Zhang*, et al.

Learning Compositional Functions with Transformers from Easy-to-Hard Data (COLT 2025)
Zixuan Wang*, Eshaan Nichani*, A. Bietti, A. Damian, Daniel Hsu, Jason D. Lee, Denny Wu

What Makes a Reward Model a Good Teacher? An Optimization Perspective (NeurIPS 2025, Spotlight)
Noam Razin, Zixuan Wang, Hubert Strauss, Stanley Wei, Jason D. Lee, Sanjeev Arora

Transformers Learn to Implement Multi-step Gradient Descent with Chain of Thought (ICLR 2025, Spotlight)
Jianhao Huang*, Zixuan Wang*, Jason D. Lee (I mentored Jianhao on this project)

Transformers Provably Learn Sparse Token Selection While Fully-Connected Nets Cannot (ICML 2024)
Zixuan Wang, Ruocheng Wei, Daniel Hsu, Jason D. Lee

Understanding Edge-of-Stability Training Dynamics With a Minimalist Example (ICLR 2023)
Xingyu Zhu*, Zixuan Wang*, Xiang Wang, Mo Zhou, Rong Ge

Analyzing Sharpness along GD Trajectory: Progressive Sharpening and Edge of Stability (NeurIPS 2022)
Zhouzi Li*, Zixuan Wang*, Jian Li