Zhenyu Gu

Yi: Open Foundation Models by 01.AI

Mar 07, 2024
01.AI: Alex Young, Bei Chen, Chao Li, Chengen Huang, Ge Zhang, Guanwei Zhang, Heng Li, Jiangcheng Zhu, Jianqun Chen, Jing Chang, Kaidong Yu, Peng Liu, Qiang Liu, Shawn Yue, Senbin Yang, Shiming Yang, Tao Yu, Wen Xie, Wenhao Huang, Xiaohui Hu, Xiaoyi Ren, Xinyao Niu, Pengcheng Nie, Yuchi Xu, Yudong Liu, Yue Wang, Yuxuan Cai, Zhenyu Gu, Zhiyuan Liu, Zonghong Dai

Energon: Towards Efficient Acceleration of Transformers Using Dynamic Sparse Attention

Oct 18, 2021
Zhe Zhou, Junlin Liu, Zhenyu Gu, Guangyu Sun

Distribution Adaptive INT8 Quantization for Training CNNs

Feb 09, 2021
Kang Zhao, Sida Huang, Pan Pan, Yinghan Li, Yingya Zhang, Zhenyu Gu, Yinghui Xu
