
Mingyi Zhang


Neuro-Symbolic Recommendation Model based on Logic Query

Sep 14, 2023
Maonian Wu, Bang Chen, Shaojun Zhu, Bo Zheng, Wei Peng, Mingyi Zhang

A recommendation system assists users in finding items that are relevant to them. Existing recommendation models primarily predict relationships between users and items, using complex matching models or extensive external information to capture association patterns in the data. However, recommendation is not only a problem of inductive statistics over data; it is also a cognitive task of reasoning about decisions based on knowledge extracted from information. A logic system could therefore be naturally incorporated to perform the reasoning in a recommendation task. Yet although hard-rule approaches based on logic systems provide powerful reasoning ability, they struggle to cope with the inconsistent and incomplete knowledge found in real-world tasks, especially complex ones such as recommendation. In this paper, we therefore propose a neuro-symbolic recommendation model that transforms a user's historical interactions into a logic expression and then casts recommendation prediction as a query task over this expression. The logic expression is computed with modular neural-network logic operations, and we construct an implicit logic encoder to reduce the complexity of the logic computation. Finally, a user's items of interest can be queried in the vector space based on the computation results. Experiments on three well-known datasets show that our method outperforms state-of-the-art shallow, deep, session-based, and reasoning models.

* 17 pages, 6 figures 
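The sketch below illustrates the general idea the abstract describes, not the authors' actual model: a user's interaction history is folded into a logic-expression vector by a learned AND module, and candidate items are scored by similarity to that query vector in the shared embedding space. All module names, layer sizes, the left-to-right folding order, and the cosine-similarity scoring rule are illustrative assumptions; the paper's implicit logic encoder is not modeled here.

```python
# Hedged sketch of a neuro-symbolic logic-query recommender (assumed design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralLogicOp(nn.Module):
    """A learned binary logic operator (e.g., AND) over event embeddings."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def forward(self, a, b):
        return self.net(torch.cat([a, b], dim=-1))

class LogicQueryRecommender(nn.Module):
    def __init__(self, num_items, dim=64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim)
        self.AND = NeuralLogicOp(dim)  # modular neural logic operation

    def encode_history(self, history):
        """Fold the user's interacted items into one logic-expression
        vector by repeatedly applying the neural AND module."""
        vecs = self.item_emb(history)            # (seq_len, dim)
        expr = vecs[0]
        for v in vecs[1:]:
            expr = self.AND(expr, v)
        return expr

    def score_items(self, history):
        """Score every candidate item by cosine similarity between the
        query expression and the item embeddings in vector space."""
        query = self.encode_history(history)
        items = self.item_emb.weight             # (num_items, dim)
        return F.cosine_similarity(query.unsqueeze(0), items, dim=-1)

model = LogicQueryRecommender(num_items=1000)
history = torch.tensor([3, 17, 42])              # toy interaction sequence
scores = model.score_items(history)
print(scores.topk(5).indices)                    # top-5 queried items
```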

Delving into Transformer for Incremental Semantic Segmentation

Nov 18, 2022
Zekai Xu, Mingyi Zhang, Jiayue Hou, Xing Gong, Chuan Wen, Chengjie Wang, Junge Zhang


Incremental semantic segmentation (ISS) is an emerging task in which an old model is updated by incrementally adding new classes. At present, methods based on convolutional neural networks dominate ISS. However, studies have shown that such methods have difficulty learning new tasks while maintaining good performance on old ones (catastrophic forgetting). In contrast, a Transformer-based method has a natural advantage in curbing catastrophic forgetting, owing to its ability to model both long-term and short-term tasks. In this work, we explore why Transformer-based architectures are better suited to ISS and accordingly propose TISS, a Transformer-based method for incremental semantic segmentation. In addition, to better alleviate catastrophic forgetting while preserving transferability, we introduce two patch-wise contrastive losses that imitate similar features and enhance feature diversity, respectively, further improving the performance of TISS. In extensive experiments on the Pascal VOC 2012 and ADE20K datasets, our method significantly outperforms state-of-the-art incremental semantic segmentation methods.
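Below is a hedged sketch of patch-wise contrastive objectives in the spirit the abstract describes, not the paper's exact losses: one term pulls each new-model patch feature toward the corresponding old-model patch (imitating similar features to curb forgetting), while a second pushes distinct patches within an image apart (encouraging feature diversity). Shapes, the temperature, and the loss weighting are illustrative assumptions.

```python
# Assumed patch-wise contrastive terms for incremental segmentation.
import torch
import torch.nn.functional as F

def patch_contrastive_losses(new_feats, old_feats, tau=0.1):
    """new_feats/old_feats: (num_patches, dim) patch tokens from the
    current model and the frozen previous-step model for one image."""
    new = F.normalize(new_feats, dim=-1)
    old = F.normalize(old_feats, dim=-1)

    # Imitation term: InfoNCE with the matching old patch as the
    # positive and all other old patches as negatives.
    logits = new @ old.t() / tau                 # (P, P) similarities
    targets = torch.arange(new.size(0))
    imitation = F.cross_entropy(logits, targets)

    # Diversity term: penalize similarity between different new patches.
    sim = new @ new.t()
    off_diag = sim - torch.diag_embed(sim.diagonal())
    diversity = off_diag.abs().mean()
    return imitation, diversity

# Toy usage with random patch features, e.g., 196 patches of dim 768.
new_feats = torch.randn(196, 768, requires_grad=True)
old_feats = torch.randn(196, 768)                # frozen old model output
l_imit, l_div = patch_contrastive_losses(new_feats, old_feats)
loss = l_imit + 0.1 * l_div                      # assumed weighting
loss.backward()
```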
