
Linh Ngo Van

TokenRatio: Principled Token-Level Preference Optimization via Ratio Matching

May 14, 2026

Selective Off-Policy Reference Tuning with Plan Guidance

May 13, 2026

LLM-XTM: Enhancing Cross-Lingual Topic Models with Large Language Models

May 05, 2026

MIPIC: Matryoshka Representation Learning via Self-Distilled Intra-Relational and Progressive Information Chaining

Apr 27, 2026

DWA-KD: Dual-Space Weighting and Time-Warped Alignment for Cross-Tokenizer Knowledge Distillation

Feb 25, 2026

CTPD: Cross Tokenizer Preference Distillation

Jan 17, 2026

Hierarchical Neural Collapse Detection Transformer for Class Incremental Object Detection

Jun 10, 2025

Towards Rehearsal-Free Continual Relation Extraction: Capturing Within-Task Variance with Adaptive Prompting

May 20, 2025

Few-Shot, No Problem: Descriptive Continual Relation Extraction

Feb 27, 2025

CoT2Align: Cross-Chain of Thought Distillation via Optimal Transport Alignment for Language Models with Different Tokenizers

Feb 25, 2025