Zhuoran Shen

Simple Open-Vocabulary Object Detection with Vision Transformers

May 12, 2022
Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, Neil Houlsby

Global Self-Attention Networks for Image Recognition

Oct 14, 2020
Zhuoran Shen, Irwan Bello, Raviteja Vemulapalli, Xuhui Jia, Ching-Hui Chen

Fast Video Object Segmentation using the Global Context Module

Jan 30, 2020
Yu Li, Zhuoran Shen, Ying Shan

Factorized Attention: Self-Attention with Linear Complexities

Dec 04, 2018
Zhuoran Shen, Mingyuan Zhang, Shuai Yi, Junjie Yan, Haiyu Zhao