Weiting Tan

It Takes Two: On the Seamlessness between Reward and Policy Model in RLHF

Jun 12, 2024

DiffNorm: Self-Supervised Normalization for Non-autoregressive Speech-to-speech Translation

May 22, 2024

Streaming Sequence Transduction through Dynamic Compression

Feb 02, 2024

Contrastive Preference Optimization: Pushing the Boundaries of LLM Performance in Machine Translation

Feb 02, 2024

The Language Barrier: Dissecting Safety Challenges of LLMs in Multilingual Contexts

Jan 23, 2024

Structure-Aware Path Inference for Neural Finite State Transducers

Dec 21, 2023

Narrowing the Gap between Zero- and Few-shot Machine Translation by Matching Styles

Nov 04, 2023

Condensing Multilingual Knowledge with Lightweight Language-Specific Modules

May 23, 2023

Flatness-Aware Prompt Selection Improves Accuracy and Sample Efficiency

May 18, 2023

Multilingual Representation Distillation with Contrastive Learning

Oct 10, 2022