Diffusion Models


Diffusion models are a class of generative models that learn a data distribution by reversing a gradual noising process: starting from a simple base distribution (typically Gaussian noise), they iteratively transform samples into data. They have been used in a wide range of applications, including image generation, text generation, and density estimation.
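
As a rough sketch of this idea of iteratively transforming a simple base distribution into data, the snippet below shows a DDPM-style forward noising step and the simplified training objective in Python with NumPy. The linear beta schedule, the toy data, and the placeholder predictor eps_pred are illustrative assumptions, not drawn from any of the papers listed below.

import numpy as np

# Toy DDPM-style forward process: x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps
# The linear beta schedule below is an illustrative assumption.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) by mixing the data with Gaussian noise."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return xt, eps

# Training target: a network eps_theta(x_t, t) is regressed onto eps with an
# MSE loss; generation then runs the learned reverse chain from pure Gaussian
# noise x_T back down to x_0.
rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 8))       # stand-in for a batch of data
t = int(rng.integers(0, T))            # random timestep for this batch
xt, eps = q_sample(x0, t, rng)
eps_pred = np.zeros_like(xt)           # placeholder for eps_theta(x_t, t)
loss = np.mean((eps_pred - eps) ** 2)  # simplified DDPM training loss
print(f"t={t}, loss={loss:.3f}")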

Locality in Image Diffusion Models Emerges from Data Statistics

Sep 11, 2025

Plug-and-play Diffusion Models for Image Compressive Sensing with Data Consistency Projection

Sep 11, 2025

Composable Score-based Graph Diffusion Model for Multi-Conditional Molecular Generation

Sep 11, 2025

Improving Video Diffusion Transformer Training by Multi-Feature Fusion and Alignment from Self-Supervised Vision Encoders

Sep 11, 2025

Mechanistic Learning with Guided Diffusion Models to Predict Spatio-Temporal Brain Tumor Growth

Sep 11, 2025

FlexiD-Fuse: Flexible number of inputs multi-modal medical image fusion based on diffusion model

Sep 11, 2025

Prompt Pirates Need a Map: Stealing Seeds helps Stealing Prompts

Sep 11, 2025

DiFlow-TTS: Discrete Flow Matching with Factorized Speech Tokens for Low-Latency Zero-Shot Text-To-Speech

Sep 11, 2025

Data-driven generative simulation of SDEs using diffusion models

Sep 10, 2025

A novel method and dataset for depth-guided image deblurring from smartphone Lidar

Sep 11, 2025