Laurent Condat

FedComLoc: Communication-Efficient Distributed Training of Sparse and Quantized Models

Mar 14, 2024
Kai Yi, Georg Meinhardt, Laurent Condat, Peter Richtárik

LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression

Mar 07, 2024
Laurent Condat, Artavazd Maranjyan, Peter Richtárik

RandCom: Random Communication Skipping Method for Decentralized Stochastic Optimization

Oct 12, 2023
Luyao Guo, Sulaiman A. Alghunaim, Kun Yuan, Laurent Condat, Jinde Cao

Near-Linear Time Projection onto the $\ell_{1,\infty}$ Ball; Application to Sparse Autoencoders

Jul 19, 2023
Guillaume Perez, Laurent Condat, Michel Barlaud

Explicit Personalization and Local Training: Double Communication Acceleration in Federated Learning

May 22, 2023
Kai Yi, Laurent Condat, Peter Richtárik

TAMUNA: Accelerated Federated Learning with Local Training and Partial Participation

Feb 20, 2023
Laurent Condat, Grigory Malinovsky, Peter Richtárik

Provably Doubly Accelerated Federated Learning: The First Theoretically Successful Combination of Local Training and Compressed Communication

Oct 27, 2022
Laurent Condat, Ivan Agarský, Peter Richtárik

Joint Demosaicing and Fusion of Multiresolution Compressed Acquisitions: Image Formation and Reconstruction Methods

Sep 10, 2022
Daniele Picone, Mauro Dalla Mura, Laurent Condat

Tikhonov Regularization of Sphere-Valued Signals

Jul 25, 2022
Laurent Condat

EF-BV: A Unified Theory of Error Feedback and Variance Reduction Mechanisms for Biased and Unbiased Compression in Distributed Optimization

May 09, 2022
Laurent Condat, Kai Yi, Peter Richtárik
