Michael Diskin

A critical look at the evaluation of GNNs under heterophily: are we really making progress?

Feb 22, 2023
Oleg Platonov, Denis Kuznedelev, Michael Diskin, Artem Babenko, Liudmila Prokhorenkova

SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient

Jan 27, 2023
Max Ryabinin, Tim Dettmers, Michael Diskin, Alexander Borzunov

Training Transformers Together

Jul 07, 2022
Alexander Borzunov, Max Ryabinin, Tim Dettmers, Quentin Lhoest, Lucile Saulnier, Michael Diskin, Yacine Jernite, Thomas Wolf

Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees

Oct 07, 2021
Aleksandr Beznosikov, Peter Richtárik, Michael Diskin, Max Ryabinin, Alexander Gasnikov

Secure Distributed Training at Scale

Jun 21, 2021
Eduard Gorbunov, Alexander Borzunov, Michael Diskin, Max Ryabinin

Distributed Deep Learning in Open Collaborations

Jun 18, 2021
Michael Diskin, Alexey Bukhtiyarov, Max Ryabinin, Lucile Saulnier, Quentin Lhoest, Anton Sinitsin, Dmitry Popov, Dmitry Pyrkin, Maxim Kashirin, Alexander Borzunov, Albert Villanova del Moral, Denis Mazur, Ilia Kobelev, Yacine Jernite, Thomas Wolf, Gennady Pekhimenko
