Chong You
On the Optimization Landscape of Neural Collapse under MSE Loss: Global Optimality with Unconstrained Features

Mar 02, 2022
Jinxin Zhou, Xiao Li, Tianyu Ding, Chong You, Qing Qu, Zhihui Zhu

[Figures 1–4]

Robust Training under Label Noise by Over-parameterization

Feb 28, 2022
Sheng Liu, Zhihui Zhu, Qing Qu, Chong You

[Figures 1–4]

Learning a Self-Expressive Network for Subspace Clustering

Oct 08, 2021
Shangzhi Zhang, Chong You, René Vidal, Chun-Guang Li

[Figures 1–4]

ReduNet: A White-box Deep Network from the Principle of Maximizing Rate Reduction

Jun 10, 2021
Kwan Ho Ryan Chan, Yaodong Yu, Chong You, Haozhi Qi, John Wright, Yi Ma

[Figures 1–4]

A Geometric Analysis of Neural Collapse with Unconstrained Features

May 06, 2021
Zhihui Zhu, Tianyu Ding, Jinxin Zhou, Xiao Li, Chong You, Jeremias Sulam, Qing Qu

[Figures 1–4]

Convolutional Normalization: Improving Deep Convolutional Network Robustness and Training

Mar 01, 2021
Sheng Liu, Xiao Li, Yuexiang Zhai, Chong You, Zhihui Zhu, Carlos Fernandez-Granda, Qing Qu

[Figures 1–4]

Incremental Learning via Rate Reduction

Nov 30, 2020
Ziyang Wu, Christina Baek, Chong You, Yi Ma

[Figures 1–4]

Deep Networks from the Principle of Rate Reduction

Oct 27, 2020
Kwan Ho Ryan Chan, Yaodong Yu, Chong You, Haozhi Qi, John Wright, Yi Ma

[Figures 1–4]

A Critique of Self-Expressive Deep Subspace Clustering

Oct 08, 2020
Benjamin D. Haeffele, Chong You, René Vidal

[Figures 1–4]