Xinwei Wu

Exploring Multilingual Human Value Concepts in Large Language Models: Is Value Alignment Consistent, Transferable and Controllable across Languages?

Feb 28, 2024
Shaoyang Xu, Weilong Dong, Zishan Guo, Xinwei Wu, Deyi Xiong

DEPN: Detecting and Editing Privacy Neurons in Pretrained Language Models

Oct 31, 2023
Xinwei Wu, Junzhuo Li, Minghui Xu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong

Large Language Model Alignment: A Survey

Sep 26, 2023
Tianhao Shen, Renren Jin, Yufei Huang, Chuang Liu, Weilong Dong, Zishan Guo, Xinwei Wu, Yan Liu, Deyi Xiong

FewFedWeight: Few-shot Federated Learning Framework across Multiple NLP Tasks

Dec 16, 2022
Weilong Dong, Xinwei Wu, Junzhuo Li, Shuangzhi Wu, Chao Bian, Deyi Xiong

Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework

Dec 16, 2022
Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong
