Danni Peng


Learning Gradient-based Mixup towards Flatter Minima for Domain Generalization

Sep 29, 2022
Danni Peng, Sinno Jialin Pan


To address the distribution shifts between training and test data, domain generalization (DG) leverages multiple source domains to learn a model that generalizes well to unseen domains. However, existing DG methods generally suffer from overfitting to the source domains, partly due to the limited coverage of the expected region in feature space. Motivated by this, we propose to perform mixup with data interpolation and extrapolation to cover the potential unseen regions. To prevent the detrimental effects of unconstrained extrapolation, we carefully design a policy to generate the instance weights, named Flatness-aware Gradient-based Mixup (FGMix). The policy employs a gradient-based similarity to assign greater weights to instances that carry more invariant information, and learns the similarity function towards flatter minima for better generalization. On the DomainBed benchmark, we validate the efficacy of various designs of FGMix and demonstrate its superiority over other DG algorithms.
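The core idea of the abstract, mixup that both interpolates and extrapolates, with mixing weights driven by gradient similarity, can be illustrated with a minimal sketch. This is not the authors' FGMix implementation; the weighting scheme below (softmax over cosine similarity between per-instance gradients and their mean) is a hypothetical stand-in for the learned similarity function described in the paper.

```python
import numpy as np

def mixup_extrapolate(x1, x2, lam):
    """Affine combination of two instances. lam in [0, 1] interpolates;
    lam outside [0, 1] extrapolates beyond the segment joining them."""
    return lam * x1 + (1.0 - lam) * x2

def gradient_similarity_weights(grads):
    """Hypothetical gradient-based weighting: instances whose per-example
    gradients align with the mean gradient (a proxy for domain-invariant
    information) receive larger mixing weights via a softmax."""
    g_mean = grads.mean(axis=0)
    sims = grads @ g_mean / (
        np.linalg.norm(grads, axis=1) * np.linalg.norm(g_mean) + 1e-8
    )
    e = np.exp(sims - sims.max())   # numerically stable softmax
    return e / e.sum()
```

With `lam = 1.5`, `mixup_extrapolate` produces a point outside the convex hull of the two inputs, which is how extrapolation can cover unseen regions of feature space; constraining which instances get large weights is what keeps such extrapolation from being detrimental.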

* 22 pages, 14 figures 

Learning an Adaptive Meta Model-Generator for Incrementally Updating Recommender Systems

Nov 08, 2021
Danni Peng, Sinno Jialin Pan, Jie Zhang, Anxiang Zeng


Recommender Systems (RSs) in real-world applications often deal with billions of user interactions daily. To capture the most recent trends effectively, it is common to update the model incrementally using only the newly arrived data. However, this may impede the model's ability to retain long-term information due to potential overfitting and forgetting issues. To address this problem, we propose a novel Adaptive Sequential Model Generation (ASMG) framework, which generates a better serving model from a sequence of historical models via a meta generator. For the design of the meta generator, we propose to employ Gated Recurrent Units (GRUs) to leverage their ability to capture long-term dependencies. We further introduce several novel strategies to apply together with the GRU meta generator, which not only improve its computational efficiency but also enable more accurate sequential modeling. By instantiating the model-agnostic framework on a general deep learning-based RS model, we demonstrate that our method achieves state-of-the-art performance on three public datasets and one industrial dataset.
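The meta-generator idea, a GRU that reads a sequence of historical model parameters and emits the parameters of the next serving model, can be sketched as follows. This is a minimal NumPy illustration under assumed dimensions, not the paper's actual ASMG architecture; the class name, output projection, and weight initialization are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUMetaGenerator:
    """Hypothetical sketch: a GRU cell consumes flattened parameter
    vectors of historical models (oldest to newest) and projects its
    final hidden state back into parameter space as the serving model."""

    def __init__(self, dim, hidden):
        s = 0.1
        self.Wz = rng.normal(0, s, (hidden, dim + hidden))  # update gate
        self.Wr = rng.normal(0, s, (hidden, dim + hidden))  # reset gate
        self.Wh = rng.normal(0, s, (hidden, dim + hidden))  # candidate state
        self.Wo = rng.normal(0, s, (dim, hidden))           # back to parameter space
        self.hidden = hidden

    def generate(self, param_seq):
        h = np.zeros(self.hidden)
        for x in param_seq:                 # one step per historical model
            xh = np.concatenate([x, h])
            z = sigmoid(self.Wz @ xh)       # how much to update the state
            r = sigmoid(self.Wr @ xh)       # how much past state to expose
            h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
            h = (1 - z) * h + z * h_tilde
        return self.Wo @ h                  # parameters of the serving model
```

Because the GRU carries state across the whole sequence, information from early models can survive into the generated parameters, which is the mechanism the abstract credits for retaining long-term information despite incremental updates on recent data only.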

* 11 pages, 6 figures, accepted by RecSys 2021 