Kou Misaki

Wider or Deeper? Scaling LLM Inference-Time Compute with Adaptive Branching Tree Search

Mar 06, 2025

TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models

Jan 29, 2025