Yichang Zhang

Omni-MATH: A Universal Olympiad Level Mathematic Benchmark For Large Language Models

Oct 10, 2024

Qwen2 Technical Report

Jul 16, 2024

Can Large Language Models Always Solve Easy Problems if They Can Solve Harder Ones?

Jun 18, 2024

Qwen Technical Report

Sep 28, 2023

Transferring General Multimodal Pretrained Models to Text Recognition

Dec 19, 2022

OFASys: A Multi-Modal Multi-Task Learning System for Building Generalist Models

Add code
Dec 08, 2022
Viaarxiv icon

Chinese CLIP: Contrastive Vision-Language Pretraining in Chinese

Nov 03, 2022

Sketch and Refine: Towards Faithful and Informative Table-to-Text Generation

May 31, 2021

M6: A Chinese Multimodal Pretrainer

Mar 02, 2021

Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains

Dec 02, 2020