
Ang Li

SiDA: Sparsity-Inspired Data-Aware Serving for Efficient and Scalable Large Mixture-of-Experts Models

Oct 29, 2023

Adversarial Examples Are Not Real Features

Oct 29, 2023

Building an Open-Vocabulary Video CLIP Model with Better Architectures, Optimization and Data

Oct 08, 2023

FedHyper: A Universal and Robust Learning Rate Scheduler for Federated Learning with Hypergradient Descent

Oct 06, 2023

FedNAR: Federated Optimization with Normalized Annealing Regularization

Oct 04, 2023

AntM$^{2}$C: A Large Scale Dataset For Multi-Scenario Multi-Modal CTR Prediction

Aug 31, 2023

Block-Level Interference Exploitation Precoding for MU-MISO: An ADMM Approach

Aug 30, 2023

AutoReP: Automatic ReLU Replacement for Fast Private Network Inference

Aug 20, 2023

Symbol-Level Precoding for MU-MIMO System with RIRC Receiver

Jul 27, 2023

A Novel Spatial-Temporal Variational Quantum Circuit to Enable Deep Learning on NISQ Devices

Jul 19, 2023