Sonali Singh

GPU Cluster Scheduling for Network-Sensitive Deep Learning

Jan 29, 2024
Aakash Sharma, Vivek M. Bhasi, Sonali Singh, George Kesidis, Mahmut T. Kandemir, Chita R. Das

Analysis of Distributed Deep Learning in the Cloud

Aug 30, 2022
Aakash Sharma, Vivek M. Bhasi, Sonali Singh, Rishabh Jain, Jashwant Raj Gunasekaran, Subrata Mitra, Mahmut Taylan Kandemir, George Kesidis, Chita R. Das

Machine Learning: Algorithms, Models, and Applications

Jan 06, 2022
Jaydip Sen, Sidra Mehtab, Rajdeep Sen, Abhishek Dutta, Pooja Kherwa, Saheel Ahmed, Pranay Berry, Sahil Khurana, Sonali Singh, David W. Cadotte, David W. Anderson, Kalum J. Ost, Racheal S. Akinbo, Oladunni A. Daramola, Bongs Lainjo

Exploiting Activation based Gradient Output Sparsity to Accelerate Backpropagation in CNNs

Sep 16, 2021
Anup Sarma, Sonali Singh, Huaipan Jiang, Ashutosh Pattnaik, Asit K. Mishra, Vijaykrishnan Narayanan, Mahmut T. Kandemir, Chita R. Das

Structured in Space, Randomized in Time: Leveraging Dropout in RNNs for Efficient Training

Jun 22, 2021
Anup Sarma, Sonali Singh, Huaipan Jiang, Rui Zhang, Mahmut T. Kandemir, Chita R. Das
