Xinyang Zhang

T²PO: Uncertainty-Guided Exploration Control for Stable Multi-Turn Agentic Reinforcement Learning

May 04, 2026

Autoregressive Image Generation with Masked Bit Modeling

Feb 09, 2026

From Web Search towards Agentic Deep Research: Incentivizing Search with Reasoning Agents

Jun 23, 2025

PersonaAgent: When Large Language Model Agents Meet Personalization at Test Time

Jun 06, 2025

END: Early Noise Dropping for Efficient and Effective Context Denoising

Feb 26, 2025

Prompt-Guided Mask Proposal for Two-Stage Open-Vocabulary Segmentation

Dec 13, 2024

Text2Layer: Layered Image Generation using Latent Diffusion Model

Jul 19, 2023

Patton: Language Model Pretraining on Text-Rich Networks

May 20, 2023

Federated Learning with Client-Exclusive Classes

Jan 01, 2023

TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations

Sep 15, 2022