
Preetum Nakkiran

Perspectives on the State and Future of Deep Learning - 2023

Dec 19, 2023
Micah Goldblum, Anima Anandkumar, Richard Baraniuk, Tom Goldstein, Kyunghyun Cho, Zachary C. Lipton, Melanie Mitchell, Preetum Nakkiran, Max Welling, Andrew Gordon Wilson

LiDAR: Sensing Linear Probing Performance in Joint Embedding SSL Architectures

Dec 07, 2023
Vimal Thilak, Chen Huang, Omid Saremi, Laurent Dinh, Hanlin Goh, Preetum Nakkiran, Joshua M. Susskind, Etai Littwin

Vanishing Gradients in Reinforcement Finetuning of Language Models

Oct 31, 2023
Noam Razin, Hattie Zhou, Omid Saremi, Vimal Thilak, Arwen Bradley, Preetum Nakkiran, Joshua Susskind, Etai Littwin

What Algorithms can Transformers Learn? A Study in Length Generalization

Oct 24, 2023
Hattie Zhou, Arwen Bradley, Etai Littwin, Noam Razin, Omid Saremi, Josh Susskind, Samy Bengio, Preetum Nakkiran

Smooth ECE: Principled Reliability Diagrams via Kernel Smoothing

Sep 21, 2023
Jarosław Błasiok, Preetum Nakkiran

When Does Optimizing a Proper Loss Yield Calibration?

May 30, 2023
Jarosław Błasiok, Parikshit Gopalan, Lunjia Hu, Preetum Nakkiran

Loss minimization yields multicalibration for large neural networks

Apr 19, 2023
Jarosław Błasiok, Parikshit Gopalan, Lunjia Hu, Adam Tauman Kalai, Preetum Nakkiran

A Unifying Theory of Distance from Calibration

Nov 30, 2022
Jarosław Błasiok, Parikshit Gopalan, Lunjia Hu, Preetum Nakkiran

APE: Aligning Pretrained Encoders to Quickly Learn Aligned Multimodal Representations

Oct 08, 2022
Elan Rosenfeld, Preetum Nakkiran, Hadi Pouransari, Oncel Tuzel, Fartash Faghri

The Calibration Generalization Gap

Oct 06, 2022
A. Michael Carrell, Neil Mallinar, James Lucas, Preetum Nakkiran
