Michael W. Mahoney

A Heavy-Tailed Algebra for Probabilistic Programming
Jun 15, 2023
Feynman Liang, Liam Hodgkinson, Michael W. Mahoney

SqueezeLLM: Dense-and-Sparse Quantization
Jun 13, 2023
Sehoon Kim, Coleman Hooper, Amir Gholami, Zhen Dong, Xiuyu Li, Sheng Shen, Michael W. Mahoney, Kurt Keutzer

A Three-regime Model of Network Pruning
May 28, 2023
Yefan Zhou, Yaoqing Yang, Arin Chang, Michael W. Mahoney

Constrained Optimization via Exact Augmented Lagrangian and Randomized Iterative Sketching
May 28, 2023
Ilgee Hong, Sen Na, Michael W. Mahoney, Mladen Kolar

When are ensembles really effective?
May 21, 2023
Ryan Theisen, Hyunsuk Kim, Yaoqing Yang, Liam Hodgkinson, Michael W. Mahoney

End-to-end codesign of Hessian-aware quantized neural networks for FPGAs and ASICs
Apr 13, 2023
Javier Campos, Zhen Dong, Javier Duarte, Amir Gholami, Michael W. Mahoney, Jovan Mitrevski, Nhan Tran

Full Stack Optimization of Transformer Inference: a Survey
Feb 27, 2023
Sehoon Kim, Coleman Hooper, Thanakul Wattanawong, Minwoo Kang, Ruohan Yan, Hasan Genc, Grace Dinh, Qijing Huang, Kurt Keutzer, Michael W. Mahoney, Yakun Sophia Shao, Amir Gholami

Learning Physical Models that Can Respect Conservation Laws
Feb 21, 2023
Derek Hansen, Danielle C. Maddix, Shima Alizadeh, Gaurav Gupta, Michael W. Mahoney

Big Little Transformer Decoder
Feb 15, 2023
Sehoon Kim, Karttikeya Mangalam, Jitendra Malik, Michael W. Mahoney, Amir Gholami, Kurt Keutzer

Gated Recurrent Neural Networks with Weighted Time-Delay Feedback
Dec 01, 2022
N. Benjamin Erichson, Soon Hoe Lim, Michael W. Mahoney
