Xuejun Zhang

Ask Question First for Enhancing Lifelong Language Learning

Aug 17, 2022
Han Wang, Ruiliu Fu, Xuejun Zhang, Jun Zhou, Qingwei Zhao

Figures 1–4 for Ask Question First for Enhancing Lifelong Language Learning

RVAE-LAMOL: Residual Variational Autoencoder to Enhance Lifelong Language Learning

May 22, 2022
Han Wang, Ruiliu Fu, Xuejun Zhang, Jun Zhou

Figures 1–4 for RVAE-LAMOL: Residual Variational Autoencoder to Enhance Lifelong Language Learning

Decomposing Complex Questions Makes Multi-Hop QA Easier and More Interpretable

Oct 26, 2021
Ruiliu Fu, Han Wang, Xuejun Zhang, Jun Zhou, Yonghong Yan

Figures 1–4 for Decomposing Complex Questions Makes Multi-Hop QA Easier and More Interpretable

Reminding the Incremental Language Model via Data-Free Self-Distillation

Oct 17, 2021
Han Wang, Ruiliu Fu, Chengzhang Li, Xuejun Zhang, Jun Zhou, Yonghong Yan

Figures 1–4 for Reminding the Incremental Language Model via Data-Free Self-Distillation