
Tao Lin

User-Creator Feature Dynamics in Recommender Systems with Dual Influence

Jul 19, 2024

Leveraging large language models for nano synthesis mechanism explanation: solid foundations or mere conjectures?

Jul 12, 2024

Increasing Model Capacity for Free: A Simple Strategy for Parameter Efficient Fine-tuning

Jul 01, 2024

PathGen-1.6M: 1.6 Million Pathology Image-text Pairs Generation through Multi-agent Collaboration

Jun 28, 2024

Cognitive Insights and Stable Coalition Matching for Fostering Multi-Agent Cooperation

May 28, 2024

Client2Vec: Improving Federated Learning by Distribution Shifts Aware Client Indexing

May 25, 2024

Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models

May 23, 2024

Efficiency for Free: Ideal Data Are Transportable Representations

May 23, 2024

GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost

May 23, 2024

Open-Source AI-based SE Tools: Opportunities and Challenges of Collaborative Software Learning

Apr 09, 2024