Beyond Lazy Training for Over-parameterized Tensor Decomposition

Oct 22, 2020
Xiang Wang, Chenwei Wu, Jason D. Lee, Tengyu Ma, Rong Ge

* NeurIPS 2020; the first two authors contributed equally 

Dissecting Hessian: Understanding Common Structure of Hessian in Neural Networks

Oct 08, 2020
Yikai Wu, Xingyu Zhu, Chenwei Wu, Annie Wang, Rong Ge

* 29 pages, 26 figures. Main text: 8 pages, 6 figures. The first two authors contributed equally and are listed in alphabetical order 

Efficient sampling from the Bingham distribution

Sep 30, 2020
Rong Ge, Holden Lee, Jianfeng Lu, Andrej Risteski

Guarantees for Tuning the Step Size using a Learning-to-Learn Approach

Jun 30, 2020
Xiang Wang, Shuai Yuan, Chenwei Wu, Rong Ge

Optimization Landscape of Tucker Decomposition

Jun 29, 2020
Abraham Frandsen, Rong Ge

Extracting Latent State Representations with Linear Dynamics from Rich Observations

Jun 29, 2020
Abraham Frandsen, Rong Ge

Energy-Aware DNN Graph Optimization

May 12, 2020
Yu Wang, Rong Ge, Shuang Qiu

High-Dimensional Robust Mean Estimation via Gradient Descent

May 04, 2020
Yu Cheng, Ilias Diakonikolas, Rong Ge, Mahdi Soltanolkotabi

* Under submission to ICML'20 

Spectral Learning on Matrices and Tensors

Apr 16, 2020
Majid Janzamin, Rong Ge, Jean Kossaifi, Anima Anandkumar

* Foundations and Trends in Machine Learning, Vol. 12, No. 5-6, pp. 393-536 (2019) 

Estimating Normalizing Constants for Log-Concave Distributions: Algorithms and Lower Bounds

Nov 08, 2019
Rong Ge, Holden Lee, Jianfeng Lu

* 45 pages 

Mildly Overparametrized Neural Nets can Memorize Training Data Efficiently

Sep 26, 2019
Rong Ge, Runzhe Wang, Haoyu Zhao

* 38 pages 

Explaining Landscape Connectivity of Low-cost Solutions for Multilayer Nets

Jun 14, 2019
Rohith Kuditipudi, Xiang Wang, Holden Lee, Yi Zhang, Zhiyuan Li, Wei Hu, Sanjeev Arora, Rong Ge

Faster Algorithms for High-Dimensional Robust Covariance Estimation

Jun 11, 2019
Yu Cheng, Ilias Diakonikolas, Rong Ge, David Woodruff

Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization

May 01, 2019
Rong Ge, Zhize Li, Weiyao Wang, Xiang Wang

The Step Decay Schedule: A Near Optimal, Geometrically Decaying Learning Rate Procedure

Apr 29, 2019
Rong Ge, Sham M. Kakade, Rahul Kidambi, Praneeth Netrapalli

* 25 pages, 5 tables, 5 figures 

Stochastic Gradient Descent Escapes Saddle Points Efficiently

Feb 13, 2019
Chi Jin, Praneeth Netrapalli, Rong Ge, Sham M. Kakade, Michael I. Jordan

A Short Note on Concentration Inequalities for Random Vectors with SubGaussian Norm

Feb 11, 2019
Chi Jin, Praneeth Netrapalli, Rong Ge, Sham M. Kakade, Michael I. Jordan

Understanding Composition of Word Embeddings via Tensor Decomposition

Feb 02, 2019
Abraham Frandsen, Rong Ge

Simulated Tempering Langevin Monte Carlo II: An Improved Proof using Soft Markov Chain Decomposition

Nov 29, 2018
Rong Ge, Holden Lee, Andrej Risteski

* Advances in Neural Information Processing Systems 31 (2018) 
* 55 pages. arXiv admin note: text overlap with arXiv:1710.02736 

High-Dimensional Robust Mean Estimation in Nearly-Linear Time

Nov 23, 2018
Yu Cheng, Ilias Diakonikolas, Rong Ge

Stronger generalization bounds for deep nets via a compression approach

Nov 05, 2018
Sanjeev Arora, Rong Ge, Behnam Neyshabur, Yi Zhang

Global Convergence of Policy Gradient Methods for the Linear Quadratic Regulator

Oct 21, 2018
Maryam Fazel, Rong Ge, Sham M. Kakade, Mehran Mesbahi

On the Local Minima of the Empirical Risk

Oct 17, 2018
Chi Jin, Lydia T. Liu, Rong Ge, Michael I. Jordan

* To appear in NIPS 2018 

Learning Two-layer Neural Networks with Symmetric Inputs

Oct 16, 2018
Rong Ge, Rohith Kuditipudi, Zhize Li, Xiang Wang

Non-Convex Matrix Completion Against a Semi-Random Adversary

Sep 07, 2018
Yu Cheng, Rong Ge

* Added references and fixed typos 

Matrix Completion has No Spurious Local Minimum

Jul 22, 2018
Rong Ge, Jason D. Lee, Tengyu Ma

* NIPS'16 best student paper. Fixed Theorem 2.3 in the preliminaries section of the previous version; the results are not affected 

Beyond Log-concavity: Provable Guarantees for Sampling Multi-modal Distributions using Simulated Tempering Langevin Monte Carlo

Nov 06, 2017
Rong Ge, Holden Lee, Andrej Risteski

* 53 pages 

Learning One-hidden-layer Neural Networks with Landscape Design

Nov 03, 2017
Rong Ge, Jason D. Lee, Tengyu Ma

Generalization and Equilibrium in Generative Adversarial Nets (GANs)

Aug 01, 2017
Sanjeev Arora, Rong Ge, Yingyu Liang, Tengyu Ma, Yi Zhang

* This is an updated version of an ICML'17 paper with the same title. The main difference is that the ICML'17 version proved the pure equilibrium result only for the Wasserstein GAN, whereas the current version applies to most reasonable training objectives. In particular, Theorem 4.3 now applies to both the original GAN and the Wasserstein GAN 

On the Optimization Landscape of Tensor Decompositions

Jun 18, 2017
Rong Ge, Tengyu Ma

* Best paper in the NIPS 2016 Workshop on Nonconvex Optimization for Machine Learning: Theory and Practice. In submission 
