
Huaming Chen

FedDifRC: Unlocking the Potential of Text-to-Image Diffusion Models in Heterogeneous Federated Learning

Jul 09, 2025

FedSC: Federated Learning with Semantic-Aware Collaboration

Jun 26, 2025

FedSKC: Federated Learning with Non-IID Data via Structural Knowledge Collaboration

May 25, 2025

The Tower of Babel Revisited: Multilingual Jailbreak Prompts on Closed-Source Large Language Models

May 18, 2025

From Compliance to Exploitation: Jailbreak Prompt Attacks on Multimodal LLMs

Feb 02, 2025

Towards Advancing Code Generation with Large Language Models: A Research Roadmap

Jan 20, 2025

Attribution for Enhanced Explanation with Transferable Adversarial eXploration

Dec 27, 2024

What You See Is Not Always What You Get: An Empirical Study of Code Comprehension by Large Language Models

Dec 11, 2024

AI-Compass: A Comprehensive and Effective Multi-module Testing Tool for AI Systems

Nov 09, 2024

CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence

Oct 17, 2024