Zonglin Li

Generalizable and Relightable Gaussian Splatting for Human Novel View Synthesis
May 27, 2025

Shape-Guided Clothing Warping for Virtual Try-On
Apr 21, 2025

Progressive Limb-Aware Virtual Try-On
Mar 16, 2025

Path-Adaptive Matting for Efficient Inference Under Various Computational Cost Constraints
Mar 05, 2025

Continuous Approximations for Improving Quantization Aware Training of LLMs
Oct 06, 2024

Gemini: A Family of Highly Capable Multimodal Models
Dec 19, 2023

ReST meets ReAct: Self-Improvement for Multi-Step Reasoning LLM Agent
Dec 15, 2023

ResMem: Learn what you can and memorize the rest
Feb 03, 2023

Large Models are Parsimonious Learners: Activation Sparsity in Trained Transformers
Oct 12, 2022

Decoupled Context Processing for Context Augmented Language Modeling
Oct 11, 2022