Minkai Xu「徐民凯」

I'm a CS PhD student at Stanford. Previously, I received my master's degree from the Montreal Institute for Learning Algorithms, advised by Jian Tang, and my bachelor's degree (Summa Cum Laude) from Shanghai Jiao Tong University, advised by Weinan Zhang. I have also spent time at the AI research labs of Meta, Amazon, and ByteDance.

My research is generously supported by the Sequoia Capital Stanford Graduate Fellowship.

Email: minkai [at] stanford.edu

Google Scholar   /   Twitter   /   Ins   /   Github   /   LinkedIn     

Research Topics

  • My general research interest lies in developing principled and interpretable probabilistic models to learn about and reason over our complex, structured world, with an emphasis on generative models and unsupervised learning. I also study their intersections with geometric representation learning, a fast-growing field concerned with semi-structured data such as graphs and 3D structures.
  • I'm also devoted to applying innovative ML algorithms to core real-world challenges with broad societal impact, such as foundational problems in computational chemistry, materials science, and biology.

Publications (* equal contribution)

Generative Coarse-Graining of Molecular Conformations
Wujie Wang,  Minkai Xu,  Chen Cai,  Benjamin Kurt Miller,  Tess Smidt,  Yusu Wang,  Jian Tang,  Rafael Gomez-Bombarelli 
International Conference on Machine Learning (ICML), 2022.
Computational chemistry; physics; E(3)-equivariance; graph neural networks.

GeoDiff: a Geometric Diffusion Model for Molecular Conformation Generation
Minkai Xu,  Lantao Yu,  Yang Song,  Chence Shi,  Stefano Ermon,  Jian Tang 
International Conference on Learning Representations (ICLR), 2022.
Oral Presentation [54/3391]
Geometric probabilistic models; Markov chains; SE(3)-equivariance; denoising diffusion.

Predicting Molecular Conformation via Dynamic Graph Score Matching
Shitong Luo*,  Chence Shi*,  Minkai Xu,  Jian Tang 
Neural Information Processing Systems (NeurIPS), 2021. 
Molecular 3D geometry generation; dynamic graph modeling; SE(3)-equivariance.

An End-to-End Framework for Molecular Conformation Generation via Bilevel Programming
Minkai Xu,  Wujie Wang,  Shitong Luo,  Chence Shi,  Yoshua Bengio,  Rafael Gomez-Bombarelli,  Jian Tang 
International Conference on Machine Learning (ICML), 2021. 
End-to-end learning for 3D geometry generation; bilevel optimization.

Learning Gradient Fields for Molecular Conformation Generation
Chence Shi*,  Shitong Luo*,  Minkai Xu,  Jian Tang 
International Conference on Machine Learning (ICML), 2021.
Long Talk [top 3.0%]
Molecular 3D geometry generation; denoising score-matching; SE(3)-equivariance.

Learning Neural Generative Dynamics for Molecular Conformation Generation
Minkai Xu*,  Shitong Luo*,  Yoshua Bengio,  Jian Peng,  Jian Tang 
International Conference on Learning Representations (ICLR), 2021. 
Molecular 3D structure generation; neural ODE; energy-based models.

Reciprocal Supervised Learning Improves Neural Machine Translation
Minkai Xu,  Mingxuan Wang,  Zhouhan Lin,  Hao Zhou,  Weinan Zhang,  Lei Li 
arXiv preprint.
Learning NMT models with a co-EM framework; text generation; expectation-maximization algorithm.

Energy-Based Imitation Learning
Minghuan Liu,  Tairan He,  Minkai Xu,  Weinan Zhang 
International Conference on Autonomous Agents and Multiagent Systems (AAMAS), 2021.
Imitation learning with energy-function as rewards; reinforcement learning; energy-based models.

Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling by Exploring Energy of the Discriminator
Yuxuan Song*,  Qiwei Ye*,  Minkai Xu*,  Tie-yan Liu 
arXiv preprint.
Bridging the gap between GANs and energy-based models; generative models; contrastive divergence.

Towards Generalized Implementation of Wasserstein Distance in GANs
Minkai Xu,  Zhiming Zhou,  Guansong Lu,  Jian Tang,  Weinan Zhang,  Yong Yu 
AAAI Conference on Artificial Intelligence (AAAI), 2021. 
Novel Sobolev duality of Wasserstein distances; generative adversarial networks.

A Graph to Graphs Framework for Retrosynthesis Prediction
Chence Shi,  Minkai Xu,  Hongyu Guo,  Ming Zhang,  Jian Tang 
International Conference on Machine Learning (ICML), 2020. 
Retrosynthesis with variational methods; latent variable models; drug discovery.

GraphAF: a Flow-based Autoregressive Model for Molecular Graph Generation
Chence Shi*,  Minkai Xu*,  Zhaocheng Zhu,  Weinan Zhang,  Ming Zhang,  Jian Tang 
International Conference on Learning Representations (ICLR), 2020. 
Molecular graph generation and optimization; flow-based generative models; graph learning.

Infomax Neural Joint Source-Channel Coding via Adversarial Bit Flip
Yuxuan Song,  Minkai Xu,  Lantao Yu,  Hao Zhou,  Shuo Shao,  Yong Yu 
AAAI Conference on Artificial Intelligence (AAAI), 2020. 
Robust and efficient compression coding; information theory; implicit generative modeling.

  • Sequoia Capital Stanford Graduate Fellowship (7 fellows in CS), [Announcement], Stanford. 2022-2026
  • Best Bachelor's Thesis (top 1%), Towards Generalized Wasserstein GANs [Dissertation][Video], SJTU. 2020
  • Mong Man Wai - Hong Kong Scholarship (~6,000 USD; awarded to 15 of 6,000+ sophomore and junior students at SJTU), SJTU. 2018
  • Kwang-Hua Scholarship (sophomore year, ranked 3rd/104), Kwang-Hua Foundation. 2018
  • Shanghai Scholarship (freshman year, ranked 2nd/104), Shanghai Education Ministry. 2017
Invited Talks
Professional Services
  • PC member/Reviewer: ICML'21-22, NeurIPS'21-22, ICLR'22-23, AAAI'21-23.

Updated September 2022
Thanks to Jon Barron for this amazing website template