
Linghe Kong

2DQuant: Low-bit Post-Training Quantization for Image Super-Resolution

Jun 10, 2024

Binarized Diffusion Model for Image Super-Resolution

Jun 09, 2024

C-Mamba: Channel Correlation Enhanced State Space Models for Multivariate Time Series Forecasting

Jun 08, 2024

CondTSF: One-line Plugin of Dataset Condensation for Time Series Forecasting

Jun 04, 2024

LoRA-Switch: Boosting the Efficiency of Dynamic LLM Adapters via System-Algorithm Co-design

May 28, 2024

Image Super-Resolution with Text Prompt Diffusion

Nov 24, 2023

Binarized 3D Whole-body Human Mesh Recovery

Nov 24, 2023

Natural Language based Context Modeling and Reasoning with LLMs: A Tutorial

Sep 24, 2023

Toward Reproducing Network Research Results Using Large Language Models

Sep 09, 2023

Serving MoE Models on Resource-constrained Edge Devices via Dynamic Expert Swapping

Aug 29, 2023