Itay Hubara

Train longer, generalize better: closing the generalization gap in large batch training of neural networks

Jan 01, 2018
Elad Hoffer, Itay Hubara, Daniel Soudry

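A reading note: to the best of my knowledge, this is the paper that proposed Ghost Batch Normalization, i.e. computing batch-norm statistics over small virtual ("ghost") batches inside a large batch, alongside training for more iterations, to close the large-batch generalization gap. The sketch below is an illustrative PyTorch implementation under that assumption; the class name, the ghost_size parameter and the chunking details are mine, not taken from the authors' code.

    import torch
    import torch.nn as nn

    class GhostBatchNorm2d(nn.Module):
        """Batch norm whose statistics are computed over small 'ghost' batches.

        Illustrative sketch only: the input batch is split into virtual
        batches of `ghost_size` samples and each chunk is normalized
        independently (running statistics are updated per chunk, which is
        a simplification).
        """

        def __init__(self, num_features, ghost_size=32, **bn_kwargs):
            super().__init__()
            self.ghost_size = ghost_size
            self.bn = nn.BatchNorm2d(num_features, **bn_kwargs)

        def forward(self, x):
            # At evaluation time, or for small batches, behave like plain BN.
            if not self.training or x.size(0) <= self.ghost_size:
                return self.bn(x)
            chunks = x.split(self.ghost_size, dim=0)
            return torch.cat([self.bn(chunk) for chunk in chunks], dim=0)

Used as a drop-in replacement for nn.BatchNorm2d, e.g. GhostBatchNorm2d(64, ghost_size=32) when training with batches of a few thousand samples.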

Playing SNES in the Retro Learning Environment

Feb 07, 2017
Nadav Bhonker, Shai Rozenberg, Itay Hubara

Spatial contrasting for deep unsupervised learning

Nov 21, 2016
Elad Hoffer, Itay Hubara, Nir Ailon

Deep unsupervised learning through spatial contrasting

Oct 02, 2016
Elad Hoffer, Itay Hubara, Nir Ailon

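The two spatial-contrasting entries above describe the same line of work: learning features without labels by pulling together the representations of patches cropped from the same image and pushing apart representations of patches from different images. Below is a generic sketch of such a patch-contrastive loss, assuming squared-Euclidean distances and a softmax over them; the papers' exact formulation may differ, and the function name and tensor shapes are my own conventions.

    import torch
    import torch.nn.functional as F

    def spatial_contrastive_loss(anchor, positive, negatives):
        """Generic patch-contrastive loss sketch.

        anchor, positive: (B, D) features of two patches from the same image.
        negatives:        (B, N, D) features of patches from other images.
        Minimizing the loss drives each anchor closer to its positive than
        to the negatives, via a softmax over negated squared distances.
        """
        d_pos = ((anchor - positive) ** 2).sum(dim=1, keepdim=True)   # (B, 1)
        d_neg = ((anchor.unsqueeze(1) - negatives) ** 2).sum(dim=2)   # (B, N)
        logits = -torch.cat([d_pos, d_neg], dim=1)                    # (B, 1+N)
        targets = torch.zeros(anchor.size(0), dtype=torch.long, device=anchor.device)
        return F.cross_entropy(logits, targets)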

Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations

Sep 22, 2016
Itay Hubara, Matthieu Courbariaux, Daniel Soudry, Ran El-Yaniv, Yoshua Bengio

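For context on this entry: a standard recipe for training with low-precision weights and activations is to quantize in the forward pass and use a straight-through estimator (STE) in the backward pass. The function below is a minimal, generic sketch of uniform quantization with a clipped STE, not the paper's specific scheme; the bit-width and clipping range are arbitrary illustrative choices.

    import torch

    def quantize_ste(x, num_bits=2, lo=-1.0, hi=1.0):
        """Uniformly quantize x to 2**num_bits levels in [lo, hi].

        Forward:  clip to [lo, hi] and round to the nearest level.
        Backward: straight-through estimator; gradients pass through the
        rounding as if it were the identity (and are zeroed outside the
        clipping range), implemented with the detach trick.
        """
        levels = 2 ** num_bits - 1
        x_clipped = x.clamp(lo, hi)
        scale = (hi - lo) / levels
        x_quant = torch.round((x_clipped - lo) / scale) * scale + lo
        return x_clipped + (x_quant - x_clipped).detach()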

Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1

Mar 17, 2016
Matthieu Courbariaux, Itay Hubara, Daniel Soudry, Ran El-Yaniv, Yoshua Bengio

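The +1/-1 constraint in the title is typically realized by binarizing with the sign function in the forward pass while the optimizer keeps updating a real-valued copy of the weights, and by passing gradients through a hard-tanh-style straight-through estimator. A minimal sketch of that idea (illustrative, not the paper's reference code):

    import torch

    def binarize_ste(x):
        """Binarize to {-1, +1} with a straight-through gradient estimator.

        Forward:  sign(x), mapping zeros to +1.
        Backward: the gradient flows as through a hard tanh, i.e. it is
        passed unchanged where |x| <= 1 and cancelled elsewhere.
        """
        x_clipped = x.clamp(-1.0, 1.0)  # defines the backward pass
        binary = torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))
        return x_clipped + (binary - x_clipped).detach()

In training, binarize_ste would be applied to a layer's real-valued weights (and to its activations) before each convolution or matrix multiply, while the optimizer updates the real-valued copies.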

Binarized Neural Networks

Mar 10, 2016
Itay Hubara, Daniel Soudry, Ran El-Yaniv
