Haim Sompolinsky

Probing Biological and Artificial Neural Networks with Task-dependent Neural Manifolds

Dec 21, 2023
Michael Kuoch, Chi-Ning Chou, Nikhil Parthasarathy, Joel Dapello, James J. DiCarlo, Haim Sompolinsky, SueYeon Chung

Connecting NTK and NNGP: A Unified Theoretical Framework for Neural Network Learning Dynamics in the Kernel Regime

Sep 08, 2023
Yehonatan Avidan, Qianyi Li, Haim Sompolinsky

Globally Gated Deep Linear Networks

Oct 31, 2022
Qianyi Li, Haim Sompolinsky

A theory of learning with constrained weight-distribution

Jun 14, 2022
Weishun Zhong, Ben Sorscher, Daniel D Lee, Haim Sompolinsky

Temporal support vectors for spiking neuronal networks

May 28, 2022
Ran Rubin, Haim Sompolinsky

Optimal quadratic binding for relational reasoning in vector symbolic neural architectures

Apr 14, 2022
Naoki Hiratani, Haim Sompolinsky

Soft-margin classification of object manifolds

Mar 14, 2022
Uri Cohen, Haim Sompolinsky

Statistical Mechanics of Deep Linear Neural Networks: The Back-Propagating Renormalization Group

Dec 07, 2020
Qianyi Li, Haim Sompolinsky

A new role for circuit expansion for learning in neural networks

Aug 19, 2020
Julia Steinberg, Madhu Advani, Haim Sompolinsky

Predicting the outputs of finite networks trained with noisy gradients

Apr 02, 2020
Gadi Naveh, Oded Ben-David, Haim Sompolinsky, Zohar Ringel
