Qiang Huang

When Abundance Conceals Weakness: Knowledge Conflict in Multilingual Models

Jan 11, 2026

Measuring Social Bias in Vision-Language Models with Face-Only Counterfactuals from Real Photos

Jan 11, 2026

IDRBench: Interactive Deep Research Benchmark

Jan 10, 2026

Feature Space Adaptation for Robust Model Fine-Tuning

Oct 22, 2025

Explosive Output to Enhance Jumping Ability: A Variable Reduction Ratio Design Paradigm for Humanoid Robots Knee Joint

Jun 14, 2025

Text Embeddings Should Capture Implicit Semantics, Not Just Surface Meaning

Jun 10, 2025

Don't Reinvent the Wheel: Efficient Instruction-Following Text Embedding based on Guided Space Transformation

May 30, 2025

PRISM: A Framework for Producing Interpretable Political Bias Embeddings with Political-Aware Cross-Encoder

May 30, 2025

A Token is Worth over 1,000 Tokens: Efficient Knowledge Distillation through Low-Rank Clone

May 19, 2025

PathOrchestra: A Comprehensive Foundation Model for Computational Pathology with Over 100 Diverse Clinical-Grade Tasks

Mar 31, 2025