Anima Singh

Density Weighting for Multi-Interest Personalized Recommendation

Aug 03, 2023
Nikhil Mehta, Anima Singh, Xinyang Yi, Sagar Jain, Lichan Hong, Ed H. Chi

Better Generalization with Semantic IDs: A case study in Ranking for Recommendations

Jun 13, 2023
Anima Singh, Trung Vu, Raghunandan Keshavan, Nikhil Mehta, Xinyang Yi, Lichan Hong, Lukasz Heldt, Li Wei, Ed Chi, Maheswaran Sathiamoorthy

Recommender Systems with Generative Retrieval

May 08, 2023
Shashank Rajput, Nikhil Mehta, Anima Singh, Raghunandan H. Keshavan, Trung Vu, Lukasz Heldt, Lichan Hong, Yi Tay, Vinh Q. Tran, Jonah Samost, Maciej Kula, Ed H. Chi, Maheswaran Sathiamoorthy

Understanding and Improving Knowledge Distillation

Feb 10, 2020
Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain
