Hanyu Zhao

Beyond IID: Optimizing Instruction Learning from the Perspective of Instruction Interaction and Dependency

Sep 11, 2024

AquilaMoE: Efficient Training for MoE Models with Scale-Up and Scale-Out Strategies

Aug 13, 2024

Boosting Large-scale Parallel Training Efficiency with C4: A Communication-Driven Approach

Jun 07, 2024

Llumnix: Dynamic Scheduling for Large Language Model Serving

Jun 05, 2024

Variational Continual Test-Time Adaptation

Feb 13, 2024

ROAM: memory-efficient large DNN training via optimized operator ordering and memory layout

Oct 30, 2023

Artificial Intelligence Security Competition (AISC)

Dec 07, 2022

Instance-wise Prompt Tuning for Pretrained Language Models

Jun 04, 2022

A Roadmap for Big Model

Apr 02, 2022

WuDaoMM: A large-scale Multi-Modal Dataset for Pre-training models

Mar 30, 2022