Malte Rasch

AnalogNAS: A Neural Network Design Framework for Accurate Inference with Analog In-Memory Computing

May 17, 2023
Hadjer Benmeziane, Corey Lammie, Irem Boybat, Malte Rasch, Manuel Le Gallo, Hsinyu Tsai, Ramachandran Muralidhar, Smail Niar, Hamza Ouarnoughi, Vijay Narayanan, Abu Sebastian, Kaoutar El Maghraoui

Zero-shifting Technique for Deep Neural Network Training on Resistive Cross-point Arrays

Aug 02, 2019
Hyungjun Kim, Malte Rasch, Tayfun Gokmen, Takashi Ando, Hiroyuki Miyazoe, Jae-Joon Kim, John Rozen, Seyoung Kim

Training LSTM Networks with Resistive Cross-Point Devices

Jun 01, 2018
Tayfun Gokmen, Malte Rasch, Wilfried Haensch
