Denis Kleyko

Perceptron Theory for Predicting the Accuracy of Neural Networks

Dec 14, 2020

End to End Binarized Neural Networks for Text Classification

Oct 11, 2020

Cellular Automata Can Reduce Memory Requirements of Collective-State Computing

Oct 07, 2020

Variable Binding for Sparse Distributed Representations: Theory and Applications

Sep 14, 2020
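As general background to this paper's topic (the paper itself develops binding for sparse distributed representations), a common binding operation for dense bipolar hypervectors is the element-wise (Hadamard) product, which is its own inverse. This is a minimal illustrative sketch, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10_000  # hypervector dimensionality

# Random bipolar (+1/-1) role and filler hypervectors
role = rng.choice([-1, 1], size=d)
filler = rng.choice([-1, 1], size=d)

# Bind with the element-wise product; for bipolar vectors the
# operation is self-inverse, so multiplying by the role again
# recovers the filler exactly (role * role == all ones)
bound = role * filler
recovered = bound * role

print(np.array_equal(recovered, filler))  # → True
```

The bound vector is dissimilar to both of its inputs, which is what lets role-filler pairs be stored together without interfering.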

Commentaries on "Learning Sensorimotor Control with Neuromorphic Sensors: Toward Hyperdimensional Active Perception" [Science Robotics Vol. 4, Issue 30 (2019), 1-10]

Mar 25, 2020

HyperEmbed: Tradeoffs Between Resources and Performance in NLP Tasks with Hyperdimensional Computing enabled Embedding of n-gram Statistics

Mar 03, 2020

Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks

Sep 19, 2019

Integer Echo State Networks: Hyperdimensional Reservoir Computing

Sep 22, 2018

A theory of sequence indexing and working memory in recurrent neural networks

Feb 28, 2018

Theory of the superposition principle for randomized connectionist representations in neural networks

Jul 05, 2017
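The superposition principle named in this title can be illustrated with a minimal sketch (my own illustration of the standard operation, not code from the paper): several random bipolar hypervectors are summed into a single trace, and because random high-dimensional vectors are nearly orthogonal, each stored item remains detectable by a simple dot-product similarity test.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000  # high dimensionality keeps random vectors nearly orthogonal

# Codebook of 20 random bipolar (+1/-1) hypervectors, one per symbol
codebook = rng.choice([-1, 1], size=(20, d))

# Superpose (bundle) the first 5 vectors into a single trace
trace = codebook[:5].sum(axis=0)

# Normalized dot-product similarity of every codebook vector to the trace:
# stored vectors score near 1, unstored ones near 0 (crosstalk ~ 1/sqrt(d))
sims = codebook @ trace / d
print(np.round(sims, 2))
```

As more vectors are bundled into the trace, the crosstalk noise grows and retrieval eventually fails; characterizing that capacity limit is the kind of question the paper's theory addresses.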