Dhruv Choudhary

Alternate Model Growth and Pruning for Efficient Training of Recommendation Systems

May 04, 2021
Xiaocong Du, Bhargav Bhushanam, Jiecao Yu, Dhruv Choudhary, Tianxiang Gao, Sherman Wong, Louis Feng, Jongsoo Park, Yu Cao, Arun Kejariwal

Adaptive Dense-to-Sparse Paradigm for Pruning Online Recommendation System with Non-Stationary Data

Oct 21, 2020
Mao Ye, Dhruv Choudhary, Jiecao Yu, Ellie Wen, Zeliang Chen, Jiyan Yang, Jongsoo Park, Qiang Liu, Arun Kejariwal

Fast Distributed Training of Deep Neural Networks: Dynamic Communication Thresholding for Model and Data Parallelism

Oct 18, 2020
Vipul Gupta, Dhruv Choudhary, Ping Tak Peter Tang, Xiaohan Wei, Xing Wang, Yuzhen Huang, Arun Kejariwal, Kannan Ramchandran, Michael W. Mahoney

On the Runtime-Efficacy Trade-off of Anomaly Detection Techniques for Real-Time Streaming Data

Oct 12, 2017
Dhruv Choudhary, Arun Kejariwal, Francois Orsini