Convergence to Second-Order Stationarity for Non-negative Matrix Factorization: Provably and Concurrently (with Stratis Skoulakis, Antonios Varvitsiotis and Xiao Wang).
Submitted, [Arxiv]

Last iterate convergence in no-regret learning: constrained min-max optimization for convex-concave landscapes (with Qi Lei, Sai Ganesh Nagarajan and Xiao Wang).
Submitted, [Arxiv]

Efficient Statistics for Sparse Graphical Models from Truncated Samples
(with Arnab Bhattacharya, Rathin Desai and Sai Ganesh Nagarajan).
Submitted, [Arxiv]

Better Depth-Width Trade-offs for Neural Networks through the lens of Dynamical Systems
(with Vaggos Chatziafratis and Sai Ganesh Nagarajan).
ICML 2020 [Arxiv]

Logistic regression with peer-group effects via inference in higher-order Ising models
(with Costis Daskalakis and Nishanth Dikkala).
AISTATS 2020 [Arxiv]

Depth-Width Trade-offs for ReLU Networks via Sharkovsky’s Theorem
(with Vaggos Chatziafratis, Sai Ganesh Nagarajan and Xiao Wang).
ICLR 2020 (spotlight) [Arxiv], [MIFODS Talk]

On the Analysis of EM for truncated mixtures of two Gaussians
(with Sai Ganesh Nagarajan).
ALT 2020 [Arxiv]

First-order methods Almost Always Avoid Saddle Points: The case of Vanishing Step-sizes
(with Xiao Wang and Georgios Piliouras).
NeurIPS 2019 [Arxiv]

Multiplicative Weights Updates as a distributed constrained optimization algorithm: Convergence to second-order stationary points almost always
(with Georgios Piliouras and Xiao Wang).
ICML 2019 [Arxiv]

Regression from Dependent Observations
(with Costis Daskalakis and Nishanth Dikkala).
STOC 2019 [Arxiv]

First-order Methods Almost Always Avoid Saddle Points
(with Jason D. Lee, Georgios Piliouras, Max Simchowitz, Michael I. Jordan and Benjamin Recht).
Math. Programming 2019, special issue on non-convex optimization for statistical learning. [Arxiv]
Have a look at this nice exposition about our work!

Last-Iterate Convergence: Zero-Sum Games and Constrained Min-Max Optimization
(with Costis Daskalakis).
ITCS 2019 [Arxiv], [Slides]

The Limit Points of (Optimistic) Gradient Descent in Min-Max Optimization
(with Costis Daskalakis).
NeurIPS 2018 [Arxiv], [Poster]

Cycles in Zero Sum Differential Games and Biological Diversity
(with Tung Mai, Milena Mihail, Will Ratcliff, Vijay V. Vazirani and Peter Yunker).
EC 2018 [Arxiv], [Slides]

Multiplicative Weights Update with Constant step-size in Congestion Games: Convergence, Limit Cycles and Chaos (with Gerasimos Palaiopanos and Georgios Piliouras).
NeurIPS 2017 (spotlight) [Arxiv], [Poster], [Video]

Opinion Dynamics in Networks: Convergence, Stability and Lack of Explosion
(with Tung Mai and Vijay V. Vazirani).
ICALP 2017 [Arxiv], [Slides]

Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions (with Georgios Piliouras).
ITCS 2017 [Arxiv], [Slides], [Video]

Mutation, Sexual Reproduction and Survival in Dynamic Environments
(with Ruta Mehta, Georgios Piliouras, Prasad Tetali and Vijay V. Vazirani).
ITCS 2017 [Arxiv]

Before 2017

The Computational Complexity of Genetic Diversity
(with Ruta Mehta, Georgios Piliouras and Sadra Yazdanbod).
ESA 2016 [Arxiv], [Slides]

Average Case Performance of Replicator Dynamics in Potential Games via Computing Regions of Attraction (with Georgios Piliouras).
EC 2016 [Arxiv]

Mixing Time of Markov Chains, Dynamical Systems and Evolution
(with Nisheeth K. Vishnoi).
ICALP 2016 [PDF]

Evolutionary Dynamics in finite populations mix rapidly
(with Piyush Srivastava and Nisheeth K. Vishnoi).
SODA 2016 [PDF], [Slides]

Natural Selection as an Inhibitor of Genetic Diversity: Multiplicative Weights Updates Algorithm and a Conjecture of Haploid Genetics (with Ruta Mehta and Georgios Piliouras).
ITCS 2015 [Arxiv], [Slides]

Support-theoretic subgraph preconditioners for large-scale SLAM
(with Yong-Dian Jian, Doru Balcan, Prasad Tetali and Frank Dellaert).
IROS 2013 [PDF]

PhD Thesis: Evolutionary Markov Chains, potential games and optimization under the lens of dynamical systems. [PDF]