
Ahmet Üstün

One Tokenizer To Rule Them All: Emergent Language Plasticity via Multilingual Tokenizers

Jun 12, 2025

The Multilingual Divide and Its Impact on Global AI Safety

May 27, 2025

Aya Vision: Advancing the Frontier of Multilingual Multimodality

May 13, 2025

The Leaderboard Illusion

Apr 29, 2025

Command A: An Enterprise-Ready Large Language Model

Apr 01, 2025

When Personalization Meets Reality: A Multi-Faceted Analysis of Personalized Preference Learning

Feb 26, 2025

Aya Expanse: Combining Research Breakthroughs for a New Multilingual Frontier

Dec 05, 2024

If You Can't Use Them, Recycle Them: Optimizing Merging at Scale Mitigates Performance Tradeoffs

Dec 05, 2024

Nexus: Specialization meets Adaptability for Efficiently Training Mixture of Experts

Aug 28, 2024

To Code, or Not To Code? Exploring Impact of Code in Pre-training

Aug 20, 2024