
Chong Zhang


Diffusion Implicit Policy for Unpaired Scene-aware Motion Synthesis

Dec 03, 2024

FedAH: Aggregated Head for Personalized Federated Learning

Dec 02, 2024

Target-driven Attack for Large Language Models

Nov 13, 2024

GPT-4o System Card

Oct 25, 2024

Modeling Layout Reading Order as Ordering Relations for Visually-rich Document Understanding

Sep 29, 2024

Emotional Dimension Control in Language Model-Based Text-to-Speech: Spanning a Broad Spectrum of Human Emotions

Sep 25, 2024

Uplink Over-the-Air Aggregation for Multi-Model Wireless Federated Learning

Sep 02, 2024

UNER: A Unified Prediction Head for Named Entity Recognition in Visually-rich Documents

Aug 02, 2024

Multi-task Prompt Words Learning for Social Media Content Generation

Jul 10, 2024

Skip-Layer Attention: Bridging Abstract and Detailed Dependencies in Transformers

Jun 17, 2024