Guilin Li

Tencent, WeChat Pay

Generative Pretraining at Scale: Transformer-Based Encoding of Transactional Behavior for Fraud Detection

Dec 22, 2023
Ze Yu Zhao, Zheng Zhu, Guilin Li, Wenhan Wang, Bo Wang

Realization Scheme for Visual Cryptography with Computer-generated Holograms

Dec 10, 2022
Tao Yu, Jinge Ma, Guilin Li, Dongyu Yang, Rui Ma, Yishi Shi

CROLoss: Towards a Customizable Loss for Retrieval Models in Recommender Systems

Aug 05, 2022
Yongxiang Tang, Wentao Bai, Guilin Li, Xialong Liu, Yu Zhang

DropNAS: Grouped Operation Dropout for Differentiable Architecture Search

Jan 27, 2022
Weijun Hong, Guilin Li, Weinan Zhang, Ruiming Tang, Yunhe Wang, Zhenguo Li, Yong Yu

Relaxed Conditional Image Transfer for Semi-supervised Domain Adaptation

Jan 05, 2021
Qijun Luo, Zhili Liu, Lanqing Hong, Chongxuan Li, Kuo Yang, Liyuan Wang, Fengwei Zhou, Guilin Li, Zhenguo Li, Jun Zhu

AutoFIS: Automatic Feature Interaction Selection in Factorization Models for Click-Through Rate Prediction

Mar 26, 2020
Bin Liu, Chenxu Zhu, Guilin Li, Weinan Zhang, Jincai Lai, Ruiming Tang, Xiuqiang He, Zhenguo Li, Yong Yu

StacNAS: Towards stable and consistent optimization for differentiable Neural Architecture Search

Oct 01, 2019
Guilin Li, Xing Zhang, Zitong Wang, Zhenguo Li, Tong Zhang

Revisit Knowledge Distillation: a Teacher-free Framework

Sep 25, 2019
Li Yuan, Francis E. H. Tay, Guilin Li, Tao Wang, Jiashi Feng
