Junqiu Wei

Training Multilingual Pre-trained Language Model with Byte-level Subwords

Jan 23, 2021

TensorCoder: Dimension-Wise Attention via Tensor Representation for Natural Language Modeling

Aug 12, 2020

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

Sep 05, 2019