Vaibhav Aggarwal

MobileNetV4 -- Universal Models for the Mobile Ecosystem

Apr 16, 2024
Danfeng Qin, Chas Leichner, Manolis Delakis, Marco Fornoni, Shixin Luo, Fan Yang, Weijun Wang, Colby Banbury, Chengxi Ye, Berkin Akin, Vaibhav Aggarwal, Tenghui Zhu, Daniele Moro, Andrew Howard

R-MAE: Regions Meet Masked Autoencoders

Jun 08, 2023
Duy-Kien Nguyen, Vaibhav Aggarwal, Yanghao Li, Martin R. Oswald, Alexander Kirillov, Cees G. M. Snoek, Xinlei Chen

Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles

Jun 01, 2023
Chaitanya Ryali, Yuan-Ting Hu, Daniel Bolya, Chen Wei, Haoqi Fan, Po-Yao Huang, Vaibhav Aggarwal, Arkabandhu Chowdhury, Omid Poursaeed, Judy Hoffman, Jitendra Malik, Yanghao Li, Christoph Feichtenhofer

The effectiveness of MAE pre-pretraining for billion-scale pretraining

Mar 23, 2023
Mannat Singh, Quentin Duval, Kalyan Vasudev Alwala, Haoqi Fan, Vaibhav Aggarwal, Aaron Adcock, Armand Joulin, Piotr Dollár, Christoph Feichtenhofer, Ross Girshick, Rohit Girdhar, Ishan Misra
