Aditya Krishna Menon

Metric-aware LLM inference

Mar 07, 2024
Michal Lukasik, Harikrishna Narasimhan, Aditya Krishna Menon, Felix Yu, Sanjiv Kumar

(Figures 1–4)

DistillSpec: Improving Speculative Decoding via Knowledge Distillation

Oct 12, 2023
Yongchao Zhou, Kaifeng Lyu, Ankit Singh Rawat, Aditya Krishna Menon, Afshin Rostamizadeh, Sanjiv Kumar, Jean-François Kagy, Rishabh Agarwal

(Figures 1–4)

What do larger image classifiers memorise?

Oct 09, 2023
Michal Lukasik, Vaishnavh Nagarajan, Ankit Singh Rawat, Aditya Krishna Menon, Sanjiv Kumar

(Figures 1–4)

Think before you speak: Training Language Models With Pause Tokens

Oct 03, 2023
Sachin Goyal, Ziwei Ji, Ankit Singh Rawat, Aditya Krishna Menon, Sanjiv Kumar, Vaishnavh Nagarajan

(Figures 1–4)

The importance of feature preprocessing for differentially private linear optimization

Jul 19, 2023
Ziteng Sun, Ananda Theertha Suresh, Aditya Krishna Menon


When Does Confidence-Based Cascade Deferral Suffice?

Jul 06, 2023
Wittawat Jitkrittum, Neha Gupta, Aditya Krishna Menon, Harikrishna Narasimhan, Ankit Singh Rawat, Sanjiv Kumar

(Figures 1–4)

ResMem: Learn what you can and memorize the rest

Feb 03, 2023
Zitong Yang, Michal Lukasik, Vaishnavh Nagarajan, Zonglin Li, Ankit Singh Rawat, Manzil Zaheer, Aditya Krishna Menon, Sanjiv Kumar

(Figures 1–4)

Learning to reject meets OOD detection: Are all abstentions created equal?

Jan 31, 2023
Harikrishna Narasimhan, Aditya Krishna Menon, Wittawat Jitkrittum, Sanjiv Kumar

(Figures 1–4)

On student-teacher deviations in distillation: does it pay to disobey?

Jan 30, 2023
Vaishnavh Nagarajan, Aditya Krishna Menon, Srinadh Bhojanapalli, Hossein Mobahi, Sanjiv Kumar

(Figures 1–4)

Supervision Complexity and its Role in Knowledge Distillation

Jan 28, 2023
Hrayr Harutyunyan, Ankit Singh Rawat, Aditya Krishna Menon, Seungyeon Kim, Sanjiv Kumar

(Figures 1–4)