Filip Hanzely

Federated Learning of a Mixture of Global and Local Models

Feb 10, 2020
Filip Hanzely, Peter Richtárik

Best Pair Formulation & Accelerated Scheme for Non-convex Principal Component Pursuit

May 28, 2019
Aritra Dutta, Filip Hanzely, Jingwei Liang, Peter Richtárik

One Method to Rule Them All: Variance Reduction for Data, Parameters and Many New Methods

May 27, 2019
Filip Hanzely, Peter Richtárik

A Unified Theory of SGD: Variance Reduction, Sampling, Quantization and Coordinate Descent

May 27, 2019
Eduard Gorbunov, Filip Hanzely, Peter Richtárik

99% of Parallel Optimization is Inevitably a Waste of Time

Jan 27, 2019
Konstantin Mishchenko, Filip Hanzely, Peter Richtárik

A Privacy Preserving Randomized Gossip Algorithm via Controlled Noise Insertion

Jan 27, 2019
Filip Hanzely, Jakub Konečný, Nicolas Loizou, Peter Richtárik, Dmitry Grishchenko

SEGA: Variance Reduction via Gradient Sketching

Oct 18, 2018
Filip Hanzely, Konstantin Mishchenko, Peter Richtárik

A Nonconvex Projection Method for Robust PCA

May 21, 2018
Aritra Dutta, Filip Hanzely, Peter Richtárik
