
Ves Stoyanov

Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning

Nov 12, 2020

Self-training Improves Pre-training for Natural Language Understanding

Oct 05, 2020

Conversational Semantic Parsing

Sep 28, 2020

Preserving Integrity in Online Social Networks

Sep 25, 2020

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

Oct 29, 2019