Theoretical Understanding of Batch-normalization: A Markov Chain Perspective

Mar 09, 2020
Hadi Daneshmand, Jonas Kohler, Francis Bach, Thomas Hofmann, Aurelien Lucchi

Mixing of Stochastic Accelerated Gradient Descent

Oct 31, 2019
Peiyuan Zhang, Hadi Daneshmand, Thomas Hofmann

Exponential convergence rates for Batch Normalization: The power of length-direction decoupling in non-convex optimization

Oct 06, 2018
Jonas Kohler, Hadi Daneshmand, Aurelien Lucchi, Ming Zhou, Klaus Neymeyr, Thomas Hofmann

Escaping Saddles with Stochastic Gradients

Sep 16, 2018
Hadi Daneshmand, Jonas Kohler, Aurelien Lucchi, Thomas Hofmann

Local Saddle Point Optimization: A Curvature Exploitation Approach

Jun 26, 2018
Leonard Adolphs, Hadi Daneshmand, Aurelien Lucchi, Thomas Hofmann

Accelerated Dual Learning by Homotopic Initialization

Jun 13, 2017
Hadi Daneshmand, Hamed Hassani, Thomas Hofmann

Starting Small -- Learning with Adaptive Sample Sizes

Oct 07, 2016
Hadi Daneshmand, Aurelien Lucchi, Thomas Hofmann

DynaNewton - Accelerating Newton's Method for Machine Learning

May 20, 2016
Hadi Daneshmand, Aurelien Lucchi, Thomas Hofmann

Estimating Diffusion Network Structures: Recovery Conditions, Sample Complexity & Soft-thresholding Algorithm

May 12, 2014
Hadi Daneshmand, Manuel Gomez-Rodriguez, Le Song, Bernhard Schoelkopf

* Appeared in the 31st International Conference on Machine Learning (ICML), 2014