Atsushi Nitanda

BODAME: Bilevel Optimization for Defense Against Model Extraction


Mar 11, 2021
Yuto Mori, Atsushi Nitanda, Akiko Takeda

* 18 pages 


Particle Dual Averaging: Optimization of Mean Field Neural Networks with Global Convergence Rate Analysis


Dec 31, 2020
Atsushi Nitanda, Denny Wu, Taiji Suzuki

* 32 pages 


A Novel Global Spatial Attention Mechanism in Convolutional Neural Network for Medical Image Classification


Jul 31, 2020
Linchuan Xu, Jun Huang, Atsushi Nitanda, Ryo Asaoka, Kenji Yamanishi



Online Robust and Adaptive Learning from Data Streams


Jul 23, 2020
Shintaro Fukushima, Atsushi Nitanda, Kenji Yamanishi

* 25 pages 


When Does Preconditioning Help or Hurt Generalization?


Jul 02, 2020
Shun-ichi Amari, Jimmy Ba, Roger Grosse, Xuechen Li, Atsushi Nitanda, Taiji Suzuki, Denny Wu, Ji Xu

* 38 pages 


Optimal Rates for Averaged Stochastic Gradient Descent under Neural Tangent Kernel Regime


Jun 22, 2020
Atsushi Nitanda, Taiji Suzuki

* 36 pages 


Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features


Nov 13, 2019
Shingo Yashima, Atsushi Nitanda, Taiji Suzuki



Deep learning is adaptive to intrinsic dimensionality of model smoothness in anisotropic Besov space


Oct 28, 2019
Taiji Suzuki, Atsushi Nitanda



Data Cleansing for Models Trained with SGD


Jun 20, 2019
Satoshi Hara, Atsushi Nitanda, Takanori Maehara



Refined Generalization Analysis of Gradient Descent for Over-parameterized Two-layer Neural Networks with Smooth Activations on Classification Problems


Jun 07, 2019
Atsushi Nitanda, Taiji Suzuki

* 18 pages 


Functional Gradient Boosting based on Residual Network Perception


Jul 08, 2018
Atsushi Nitanda, Taiji Suzuki

* 22 pages, 1 figure, 1 table. An extended version of ICML 2018 paper 


Gradient Layer: Enhancing the Convergence of Adversarial Training for Generative Models


Jun 14, 2018
Atsushi Nitanda, Taiji Suzuki

* 14 pages, 4 figures, AISTATS 2018 


Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors


Jun 14, 2018
Atsushi Nitanda, Taiji Suzuki

* 15 pages, 2 figures 


Stochastic Particle Gradient Descent for Infinite Ensembles


Dec 14, 2017
Atsushi Nitanda, Taiji Suzuki

* 33 pages, 1 figure 


Accelerated Stochastic Gradient Descent for Minimizing Finite Sums


Jun 10, 2015
Atsushi Nitanda

* [v2] corrected citation to proxSVRG; corrected typos in Figure 1 (option 2) and Figure 3 (R4 -> R3) 
