
Jun Hu

NVIDIA, Duke University

Autonomous Chain-of-Thought Distillation for Graph-Based Fraud Detection

Jan 30, 2026

EvoClinician: A Self-Evolving Agent for Multi-Turn Medical Diagnosis via Test-Time Evolutionary Learning

Jan 30, 2026

Cross-Modal Attention Network with Dual Graph Learning in Multimodal Recommendation

Jan 16, 2026

UniBiDex: A Unified Teleoperation Framework for Robotic Bimanual Dexterous Manipulation

Jan 08, 2026

Study of Class-Incremental Radio Frequency Fingerprint Recognition Without Storing Exemplars

Jan 06, 2026

Echoless Label-Based Pre-computation for Memory-Efficient Heterogeneous Graph Learning

Nov 14, 2025

DarkDiff: Advancing Low-Light Raw Enhancement by Retasking Diffusion Models for Camera ISP

May 29, 2025

LLaDA 1.5: Variance-Reduced Preference Optimization for Large Language Diffusion Models

May 25, 2025

LLaDA-V: Large Language Diffusion Models with Visual Instruction Tuning

May 22, 2025

RGL: A Graph-Centric, Modular Framework for Efficient Retrieval-Augmented Generation on Graphs

Mar 25, 2025