Tianhao Wang (王天浩)

6045 South Kenwood Ave
Chicago, IL 60637
tianhao.wang@ttic.edu
I am a Research Assistant Professor at the Toyota Technological Institute at Chicago (TTIC). I am broadly interested in various aspects of machine learning, optimization, and statistics.
Prior to TTIC, I received my Ph.D. from the Department of Statistics and Data Science at Yale University, where I was fortunate to be advised by Zhou Fan. I obtained my Bachelor's degree in mathematics, with a dual degree in computer science, from the University of Science and Technology of China.
In 2025, I will join the Halıcıoğlu Data Science Institute at UC San Diego as a tenure-track Assistant Professor.
I have Ph.D. openings for Fall 2025. If you are interested, please apply to the Data Science Ph.D. program in the Halıcıoğlu Data Science Institute at UC San Diego and mention my name in your application. Feel free to reach out to me via email.
CV

Recent papers (*: equal contribution)
- Can Neural Networks Achieve Optimal Computational-statistical Tradeoff? An Analysis on Single-Index Model. In International Conference on Learning Representations (ICLR), 2025. Presented at the NeurIPS 2024 Workshop on Mathematics of Modern Machine Learning.
- How Well Can Transformers Emulate In-context Newton's Method? In International Conference on Artificial Intelligence and Statistics (AISTATS), 2025. Presented at the ICLR 2024 Workshop on Bridging the Gap Between Practice and Theory in Deep Learning.
- Unveiling Induction Heads: Provable Training Dynamics and Feature Learning in Transformers. In Advances in Neural Information Processing Systems (NeurIPS), 2024. Presented at the ICML 2024 Workshop on Theoretical Foundations of Foundation Models.
- Implicit Regularization of Gradient Flow on One-Layer Softmax Attention. arXiv:2403.08699, 2024. Presented at the ICLR 2024 Workshop on Bridging the Gap Between Practice and Theory in Deep Learning.
- Approximate Message Passing for Orthogonally Invariant Ensembles: Multivariate Non-linearities and Spectral Initialization. Information and Inference: A Journal of the IMA, 2024.
- Training Dynamics of Multi-head Softmax Attention for In-context Learning: Emergence, Convergence, and Optimality. In Conference on Learning Theory (COLT), 2024. Presented at the ICLR 2024 Workshop on Bridging the Gap Between Practice and Theory in Deep Learning.
- Noise-adaptive Thompson Sampling for Linear Contextual Bandits. In Advances in Neural Information Processing Systems (NeurIPS), 2023.