
Pingzhi Li

Examining Post-Training Quantization for Mixture-of-Experts: A Benchmark

Jun 12, 2024

Hybrid Quantum-Classical Scheduling for Accelerating Neural Network Training with Newton's Gradient Descent

Apr 30, 2024

Privacy-preserving Fine-tuning of Large Language Models through Flatness

Mar 07, 2024

Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark

Feb 26, 2024

Merge, Then Compress: Demystify Efficient SMoE with Hints from Its Routing Policy

Oct 02, 2023