Yadao Wang

PanGu-Σ: Towards Trillion Parameter Language Model with Sparse Heterogeneous Computing
Mar 20, 2023
Xiaozhe Ren, Pingyi Zhou, Xinfan Meng, Xinjing Huang, Yadao Wang, Weichao Wang, Pengfei Li, Xiaoda Zhang, Alexander Podolskiy, Grigory Arshinov, Andrey Bout, Irina Piontkovskaya, Jiansheng Wei, Xin Jiang, Teng Su, Qun Liu, Jun Yao

Sparse Structure Search for Parameter-Efficient Tuning
Jun 15, 2022
Shengding Hu, Zhen Zhang, Ning Ding, Yadao Wang, Yasheng Wang, Zhiyuan Liu, Maosong Sun

HyperPELT: Unified Parameter-Efficient Language Model Tuning for Both Language and Vision-and-Language Tasks
Mar 08, 2022
Zhengkun Zhang, Wenya Guo, Xiaojun Meng, Yasheng Wang, Yadao Wang, Xin Jiang, Qun Liu, Zhenglu Yang

CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Aug 23, 2021
Xin Wang, Yasheng Wang, Pingyi Zhou, Fei Mi, Meng Xiao, Yadao Wang, Li Li, Xiao Liu, Hao Wu, Jin Liu, Xin Jiang