Mehdi Cherti

A Good CREPE needs more than just Sugar: Investigating Biases in Compositional Vision-Language Benchmarks

Jun 09, 2025

Scaling Laws for Robust Comparison of Open Foundation Language-Vision Models and Datasets

Jun 05, 2025

A Practitioner's Guide to Continual Multimodal Pretraining

Aug 26, 2024

Inverse Deep Learning Ray Tracing for Heliostat Surface Prediction

Aug 20, 2024

Alice in Wonderland: Simple Tasks Showing Complete Reasoning Breakdown in State-Of-the-Art Large Language Models

Jun 04, 2024

DataComp: In search of the next generation of multimodal datasets

May 03, 2023

A Comparative Study on Generative Models for High Resolution Solar Observation Imaging

Apr 14, 2023

Reproducible scaling laws for contrastive language-image learning

Dec 14, 2022

LAION-5B: An open large-scale dataset for training next generation image-text models

Oct 16, 2022

Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images

Jun 09, 2021