RADLADS: Rapid Attention Distillation to Linear Attention Decoders at Scale

May 07, 2025
