John Tan Chong Min

Optimizing Learning Rate Schedules for Iterative Pruning of Deep Neural Networks

Dec 09, 2022
Shiyu Liu, Rohan Ghosh, John Tan Chong Min, Mehul Motani

DropNet: Reducing Neural Network Complexity via Iterative Pruning

Jul 14, 2022
John Tan Chong Min, Mehul Motani

Brick Tic-Tac-Toe: Exploring the Generalizability of AlphaZero to Novel Test Environments

Jul 14, 2022
John Tan Chong Min, Mehul Motani
