Dongdong Zhang

HLT-MT: High-resource Language-specific Training for Multilingual Neural Machine Translation

Jul 15, 2022
Jian Yang, Yuwei Yin, Shuming Ma, Dongdong Zhang, Zhoujun Li, Furu Wei

UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation

Jul 11, 2022
Jian Yang, Yuwei Yin, Shuming Ma, Dongdong Zhang, Shuangzhi Wu, Hongcheng Guo, Zhoujun Li, Furu Wei

DeepNet: Scaling Transformers to 1,000 Layers

Mar 01, 2022
Hongyu Wang, Shuming Ma, Li Dong, Shaohan Huang, Dongdong Zhang, Furu Wei

Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt

Feb 23, 2022
Lianzhe Huang, Shuming Ma, Dongdong Zhang, Furu Wei, Houfeng Wang

Phrase-level Adversarial Example Generation for Neural Machine Translation

Jan 06, 2022
Juncheng Wan, Jian Yang, Shuming Ma, Dongdong Zhang, Weinan Zhang, Yong Yu, Furu Wei

SMDT: Selective Memory-Augmented Neural Document Translation

Jan 05, 2022
Xu Zhang, Jian Yang, Haoyang Huang, Shuming Ma, Dongdong Zhang, Jinlong Li, Furu Wei

Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task

Nov 03, 2021
Jian Yang, Shuming Ma, Haoyang Huang, Dongdong Zhang, Li Dong, Shaohan Huang, Alexandre Muzio, Saksham Singhal, Hany Hassan Awadalla, Xia Song, Furu Wei

Towards Making the Most of Multilingual Pretraining for Zero-Shot Neural Machine Translation

Oct 16, 2021
Guanhua Chen, Shuming Ma, Yun Chen, Dongdong Zhang, Jia Pan, Wenping Wang, Furu Wei

DeltaLM: Encoder-Decoder Pre-training for Language Generation and Translation by Augmenting Pretrained Multilingual Encoders

Jun 25, 2021
Shuming Ma, Li Dong, Shaohan Huang, Dongdong Zhang, Alexandre Muzio, Saksham Singhal, Hany Hassan Awadalla, Xia Song, Furu Wei

How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation?

May 27, 2021
Weijia Xu, Shuming Ma, Dongdong Zhang, Marine Carpuat
