Lingchuan Meng

Restructurable Activation Networks
Aug 17, 2022
Kartikeya Bhardwaj, James Ward, Caleb Tung, Dibakar Gope, Lingchuan Meng, Igor Fedorov, Alex Chalfin, Paul Whatmough, Danny Loh

Armour: Generalizable Compact Self-Attention for Vision Transformers
Aug 03, 2021
Lingchuan Meng

Collapsible Linear Blocks for Super-Efficient Super Resolution
Mar 17, 2021
Kartikeya Bhardwaj, Milos Milosavljevic, Alex Chalfin, Naveen Suda, Liam O'Neil, Dibakar Gope, Lingchuan Meng, Ramon Matas, Danny Loh

Efficient Winograd Convolution via Integer Arithmetic
Jan 07, 2019
Lingchuan Meng, John Brothers
