Anastasios Kyrillidis

i-SpaSP: Structured Neural Pruning via Sparse Signal Recovery

Dec 07, 2021
Cameron R. Wolfe, Anastasios Kyrillidis

On the Convergence of Shallow Neural Network Training with Randomly Masked Neurons

Dec 05, 2021
Fangshuo Liao, Anastasios Kyrillidis

Convergence and Stability of the Stochastic Proximal Point Algorithm with Momentum

Dec 03, 2021
Junhyung Lyle Kim, Panos Toulis, Anastasios Kyrillidis

Federated Multiple Label Hashing (FedMLH): Communication Efficient Federated Learning on Extreme Classification Tasks

Oct 23, 2021
Zhenwei Dai, Chen Dun, Yuxin Tang, Anastasios Kyrillidis, Anshumali Shrivastava

Provably Efficient Lottery Ticket Discovery

Jul 31, 2021
Cameron R. Wolfe, Qihan Wang, Junhyung Lyle Kim, Anastasios Kyrillidis

REX: Revisiting Budgeted Training with an Improved Schedule

Jul 09, 2021
John Chen, Cameron Wolfe, Anastasios Kyrillidis

Momentum-inspired Low-Rank Coordinate Descent for Diagonally Constrained SDPs

Jul 03, 2021
Junhyung Lyle Kim, Jose Antonio Lara Benitez, Mohammad Taha Toghani, Cameron Wolfe, Zhiwei Zhang, Anastasios Kyrillidis

ResIST: Layer-Wise Decomposition of ResNets for Distributed Training

Jul 02, 2021
Chen Dun, Cameron R. Wolfe, Christopher M. Jermaine, Anastasios Kyrillidis

Mitigating deep double descent by concatenating inputs

Jul 02, 2021
John Chen, Qihan Wang, Anastasios Kyrillidis
