
Ziliang Zong

Reduce, Reuse, Recycle: Improving Training Efficiency with Distillation
Nov 01, 2022
Cody Blakeney, Jessica Zosa Forde, Jonathan Frankle, Ziliang Zong, Matthew L. Leavitt

Learning Omnidirectional Flow in 360-degree Video via Siamese Representation
Aug 07, 2022
Keshav Bhandari, Bin Duan, Gaowen Liu, Hugo Latapie, Ziliang Zong, Yan Yan

Network Binarization via Contrastive Learning
Jul 16, 2022
Yuzhang Shang, Dan Xu, Ziliang Zong, Liqiang Nie, Yan Yan

Lipschitz Continuity Retained Binary Neural Network
Jul 16, 2022
Yuzhang Shang, Dan Xu, Bin Duan, Ziliang Zong, Liqiang Nie, Yan Yan

Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning
Jan 30, 2022
Yuzhang Shang, Bin Duan, Ziliang Zong, Liqiang Nie, Yan Yan

Measure Twice, Cut Once: Quantifying Bias and Fairness in Deep Neural Networks
Oct 08, 2021
Cody Blakeney, Gentry Atkinson, Nathaniel Huish, Yan Yan, Vangelis Metris, Ziliang Zong

Lipschitz Continuity Guided Knowledge Distillation
Aug 29, 2021
Yuzhang Shang, Bin Duan, Ziliang Zong, Liqiang Nie, Yan Yan

Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation
Jun 15, 2021
Cody Blakeney, Nathaniel Huish, Yan Yan, Ziliang Zong

Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression
Dec 05, 2020
Cody Blakeney, Xiaomin Li, Yan Yan, Ziliang Zong
