
Darius Catrina

Reverse Distillation: Consistently Scaling Protein Language Model Representations

Mar 08, 2026

Distilling the Knowledge of Romanian BERTs Using Multiple Teachers

Jan 11, 2022