Junqiu Wei

Training Multilingual Pre-trained Language Model with Byte-level Subwords

Jan 23, 2021
Junqiu Wei, Qun Liu, Yinpeng Guo, Xin Jiang

TensorCoder: Dimension-Wise Attention via Tensor Representation for Natural Language Modeling

Aug 12, 2020
Shuai Zhang, Peng Zhang, Xindian Ma, Junqiu Wei, Ningning Wang, Qun Liu

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

Sep 05, 2019
Junqiu Wei, Xiaozhe Ren, Xiaoguang Li, Wenyong Huang, Yi Liao, Yasheng Wang, Jiashu Lin, Xin Jiang, Xiao Chen, Qun Liu
