Chris Eliasmith

Debugging using Orthogonal Gradient Descent

Jun 17, 2022
Narsimha Chilkuri, Chris Eliasmith


Language Modeling using LMUs: 10x Better Data Efficiency or Improved Scaling Compared to Transformers

Oct 05, 2021
Narsimha Chilkuri, Eric Hunsberger, Aaron Voelker, Gurshaant Malik, Chris Eliasmith


A Spiking Neural Network for Image Segmentation

Jun 16, 2021
Kinjal Patel, Eric Hunsberger, Sean Batir, Chris Eliasmith


Parallelizing Legendre Memory Unit Training

Feb 22, 2021
Narsimha Chilkuri, Chris Eliasmith


Hardware Aware Training for Efficient Keyword Spotting on General Purpose and Specialized Hardware

Sep 23, 2020
Peter Blouw, Gurshaant Malik, Benjamin Morcos, Aaron R. Voelker, Chris Eliasmith


Low-Power Low-Latency Keyword Spotting and Adaptive Control with a SpiNNaker 2 Prototype and Comparison with Loihi

Sep 18, 2020
Yexin Yan, Terrence C. Stewart, Xuan Choo, Bernhard Vogginger, Johannes Partzsch, Sebastian Hoeppner, Florian Kelber, Chris Eliasmith, Steve Furber, Christian Mayr


Nengo and low-power AI hardware for robust, embedded neurorobotics

Aug 29, 2020
Travis DeWolf, Pawel Jaworski, Chris Eliasmith


A Spike in Performance: Training Hybrid-Spiking Neural Networks with Quantized Activation Functions

Feb 10, 2020
Aaron R. Voelker, Daniel Rasmussen, Chris Eliasmith
