Chenglin Yang

IG Captioner: Information Gain Captioners are Strong Zero-shot Classifiers

Nov 27, 2023
Chenglin Yang, Siyuan Qiao, Yuan Cao, Yu Zhang, Tao Zhu, Alan Yuille, Jiahui Yu

MOAT: Alternating Mobile Convolution and Attention Brings Strong Vision Models

Oct 04, 2022
Chenglin Yang, Siyuan Qiao, Qihang Yu, Xiaoding Yuan, Yukun Zhu, Alan Yuille, Hartwig Adam, Liang-Chieh Chen

Lite Vision Transformer with Enhanced Self-Attention

Dec 20, 2021
Chenglin Yang, Yilin Wang, Jianming Zhang, He Zhang, Zijun Wei, Zhe Lin, Alan Yuille

Locally Enhanced Self-Attention: Rethinking Self-Attention as Local and Context Terms

Jul 12, 2021
Chenglin Yang, Siyuan Qiao, Adam Kortylewski, Alan Yuille

Meticulous Object Segmentation

Dec 13, 2020
Chenglin Yang, Yilin Wang, Jianming Zhang, He Zhang, Zhe Lin, Alan Yuille

Robustness Out of the Box: Compositional Representations Naturally Defend Against Black-Box Patch Attacks

Dec 01, 2020
Christian Cosgrove, Adam Kortylewski, Chenglin Yang, Alan Yuille

PatchAttack: A Black-box Texture-based Attack with Reinforcement Learning

Apr 12, 2020
Chenglin Yang, Adam Kortylewski, Cihang Xie, Yinzhi Cao, Alan Yuille

Snapshot Distillation: Teacher-Student Optimization in One Generation

Dec 01, 2018
Chenglin Yang, Lingxi Xie, Chi Su, Alan L. Yuille

Knowledge Distillation in Generations: More Tolerant Teachers Educate Better Students

Sep 07, 2018
Chenglin Yang, Lingxi Xie, Siyuan Qiao, Alan Yuille
