Dinh Phung

Planning for Success: Exploring LLM Long-term Planning Capabilities in Table Understanding
Aug 23, 2025

Improving Table Understanding with LLMs and Entity-Oriented Search
Aug 23, 2025

Preserving Clusters in Prompt Learning for Unsupervised Domain Adaptation
Jun 13, 2025

Promoting Ensemble Diversity with Interactive Bayesian Distributional Robustness for Fine-tuning Foundation Models
Jun 08, 2025

Optimizing Specific and Shared Parameters for Efficient Parameter Tuning
Apr 04, 2025

Unbiased Sliced Wasserstein Kernels for High-Quality Audio Captioning
Feb 08, 2025

GFM-RAG: Graph Foundation Model for Retrieval Augmented Generation
Feb 03, 2025

Fantastic Targets for Concept Erasure in Diffusion Models and Where To Find Them
Jan 31, 2025

PanSplat: 4K Panorama Synthesis with Feed-Forward Gaussian Splatting
Dec 16, 2024

Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation
Nov 26, 2024