Chenze Shao

Patch-Level Training for Large Language Models
Jul 17, 2024

Understanding and Addressing the Under-Translation Problem from the Perspective of Decoding Objective
May 29, 2024

Language Generation with Strictly Proper Scoring Rules
May 29, 2024

Non-autoregressive Machine Translation with Probabilistic Context-free Grammar
Nov 14, 2023

Beyond MLE: Convex Learning for Text Generation
Oct 26, 2023

Non-autoregressive Streaming Transformer for Simultaneous Translation
Oct 23, 2023

Fuzzy Alignments in Directed Acyclic Graph for Non-Autoregressive Machine Translation
Mar 12, 2023

Rephrasing the Reference for Non-Autoregressive Machine Translation
Nov 30, 2022

Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation
Oct 11, 2022

Non-Monotonic Latent Alignments for CTC-Based Non-Autoregressive Machine Translation
Oct 08, 2022