Jingqiao Zhang

GUIM -- General User and Item Embedding with Mixture of Representation in E-commerce
Jul 02, 2022
Chao Yang, Ru He, Fangquan Lin, Suoyuan Song, Jingqiao Zhang, Cheng Yang

Improving Contrastive Learning of Sentence Embeddings with Case-Augmented Positives and Retrieved Negatives
Jun 06, 2022
Wei Wang, Liangzhu Ge, Jingqiao Zhang, Cheng Yang

SAS: Self-Augmented Strategy for Language Model Pre-training
Jun 14, 2021
Yifei Xu, Jingqiao Zhang, Ru He, Liangzhu Ge, Chao Yang, Cheng Yang, Ying Nian Wu

Progressively Stacking 2.0: A Multi-stage Layerwise Training Method for BERT Training Speedup
Nov 27, 2020
Cheng Yang, Shengnan Wang, Chao Yang, Yuechuan Li, Ru He, Jingqiao Zhang

CoRe: An Efficient Coarse-refined Training Framework for BERT
Nov 27, 2020
Cheng Yang, Shengnan Wang, Yuechuan Li, Chao Yang, Ming Yan, Jingqiao Zhang, Fangquan Lin