Denis Kuznedelev

YaART: Yet Another ART Rendering Technology
Apr 08, 2024
Sergey Kastryulin, Artem Konev, Alexander Shishenya, Eugene Lyapustin, Artem Khurshudov, Alexander Tselousov, Nikita Vinokurov, Denis Kuznedelev, Alexander Markovich, Grigoriy Livshits, Alexey Kirillov, Anastasiia Tabisheva, Liubov Chubarova, Marina Kaminskaia, Alexander Ustyuzhanin, Artemii Shvetsov, Daniil Shlenskii, Valerii Startsev, Dmitrii Kornilov, Mikhail Romanov, Artem Babenko, Sergei Ovcharenko, Valentin Khrulkov

Extreme Compression of Large Language Models via Additive Quantization
Jan 11, 2024
Vage Egiazarian, Andrei Panferov, Denis Kuznedelev, Elias Frantar, Artem Babenko, Dan Alistarh

Sparse Fine-tuning for Inference Acceleration of Large Language Models
Oct 13, 2023
Eldar Kurtic, Denis Kuznedelev, Elias Frantar, Michael Goin, Dan Alistarh

Sparse Finetuning for Inference Acceleration of Large Language Models
Oct 10, 2023
Eldar Kurtic, Denis Kuznedelev, Elias Frantar, Michael Goin, Dan Alistarh

Accurate Neural Network Pruning Requires Rethinking Sparse Optimization
Aug 03, 2023
Denis Kuznedelev, Eldar Kurtic, Eugenia Iofinova, Elias Frantar, Alexandra Peste, Dan Alistarh

SpQR: A Sparse-Quantized Representation for Near-Lossless LLM Weight Compression
Jun 05, 2023
Tim Dettmers, Ruslan Svirschevski, Vage Egiazarian, Denis Kuznedelev, Elias Frantar, Saleh Ashkboos, Alexander Borzunov, Torsten Hoefler, Dan Alistarh

Vision Models Can Be Efficiently Specialized via Few-Shot Task-Aware Compression
Mar 25, 2023
Denis Kuznedelev, Soroush Tabesh, Kimia Noorbakhsh, Elias Frantar, Sara Beery, Eldar Kurtic, Dan Alistarh

Evaluating Robustness and Uncertainty of Graph Models Under Structural Distributional Shifts
Feb 27, 2023
Gleb Bazhenov, Denis Kuznedelev, Andrey Malinin, Artem Babenko, Liudmila Prokhorenkova

A critical look at the evaluation of GNNs under heterophily: are we really making progress?
Feb 22, 2023
Oleg Platonov, Denis Kuznedelev, Michael Diskin, Artem Babenko, Liudmila Prokhorenkova

oViT: An Accurate Second-Order Pruning Framework for Vision Transformers
Oct 14, 2022
Denis Kuznedelev, Eldar Kurtic, Elias Frantar, Dan Alistarh
