Hongming Zhang

VScan: Rethinking Visual Token Reduction for Efficient Large Vision-Language Models
May 28, 2025

InComeS: Integrating Compression and Selection Mechanisms into LLMs for Efficient Model Editing
May 28, 2025

WebCoT: Enhancing Web Agent Reasoning by Reconstructing Chain-of-Thought in Reflection, Branching, and Rollback
May 26, 2025

Recall with Reasoning: Chain-of-Thought Distillation for Mamba's Long-Context Memory and Extrapolation
May 06, 2025

WebEvolver: Enhancing Web Agent Self-Improvement with Coevolving World Model
Apr 23, 2025

Enhancing Web Agents with Explicit Rollback Mechanisms
Apr 16, 2025

OpenCharacter: Training Customizable Role-Playing LLMs with Large-Scale Synthetic Personas
Jan 26, 2025

β-DQN: Improving Deep Q-Learning By Evolving the Behavior
Jan 01, 2025

Attention Entropy is a Key Factor: An Analysis of Parallel Context Encoding with Full-attention-based Pre-trained Language Models
Dec 21, 2024

OpenWebVoyager: Building Multimodal Web Agents via Iterative Real-World Exploration, Feedback and Optimization
Oct 25, 2024