Yongfeng Huang

Fastformer: Additive Attention Can Be All You Need

Sep 05, 2021
Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie


UserBERT: Contrastive User Model Pre-training

Sep 03, 2021
Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Xing Xie


Smart Bird: Learnable Sparse Attention for Efficient and Effective Transformer

Sep 02, 2021
Chuhan Wu, Fangzhao Wu, Tao Qi, Binxing Jiao, Daxin Jiang, Yongfeng Huang, Xing Xie


FedKD: Communication Efficient Federated Learning via Knowledge Distillation

Aug 30, 2021
Chuhan Wu, Fangzhao Wu, Ruixuan Liu, Lingjuan Lyu, Yongfeng Huang, Xing Xie


Is News Recommendation a Sequential Recommendation Task?

Aug 26, 2021
Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang


Fastformer: Additive Attention is All You Need

Aug 20, 2021
Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang
