Abstract: Large language models (LLMs) have been extensively studied as neural knowledge bases for their knowledge access, editability, reasoning, and explainability. However, few works focus on the structural patterns of their knowledge. Motivated by this gap, we investigate these structural patterns from a graph perspective. We quantify the knowledge of LLMs at both the triplet and entity levels and analyze how it relates to graph structural properties such as node degree. Furthermore, we uncover knowledge homophily, where topologically close entities exhibit similar levels of knowledgeability, which motivates us to develop graph machine learning models that estimate an entity's knowledgeability from its local neighbors. This model further enables valuable knowledge checking by selecting triplets that are less known to LLMs. Empirical results show that fine-tuning on the selected triplets leads to superior performance.
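To make the neighbor-based estimation concrete, below is a minimal sketch, not the paper's code, of a two-layer GCN-style regressor that predicts a per-entity knowledgeability score from its local neighborhood. The class name, feature dimensions, and toy graph are illustrative assumptions.

```python
import torch
import torch.nn as nn

class KnowledgeabilityGNN(nn.Module):
    """Hypothetical GCN that regresses a knowledgeability score per entity."""
    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hidden_dim)
        self.lin2 = nn.Linear(hidden_dim, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalize the adjacency (with self-loops), as in a GCN.
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).clamp(min=1).pow(-0.5)
        a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
        h = torch.relu(self.lin1(a_norm @ x))        # aggregate neighbor features
        return torch.sigmoid(self.lin2(a_norm @ h))  # score in [0, 1] per entity

# Toy usage: 5 entities with random features, trained against probed LLM scores.
x = torch.randn(5, 16)
adj = (torch.rand(5, 5) > 0.5).float().triu(1)
adj = adj + adj.T                      # undirected KG skeleton (assumed)
target = torch.rand(5, 1)              # stand-in knowledgeability labels
model = KnowledgeabilityGNN(16)
loss = nn.functional.mse_loss(model(x, adj), target)
loss.backward()
```

Entities predicted to have low scores correspond to triplets the LLM likely does not know, which is the selection signal used for fine-tuning.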
Abstract: Temporal knowledge graph (TKG) reasoning predicts future events from historical data, a task made challenging by the complex semantic and hierarchical information involved. Existing Euclidean models excel at capturing semantics but struggle with hierarchy. Conversely, hyperbolic models handle hierarchical features well but fail to represent complex semantics, owing to the limited parameters of shallow models and the absence of proper normalization in deep models that rely on the L2 norm. Current solutions, such as curvature transformations, are insufficient to address these issues. In this work, we propose a novel hybrid geometric space approach that leverages the strengths of both Euclidean and hyperbolic models. Our approach transitions from single-space to multi-space parameter modeling, effectively capturing both semantic and hierarchical information. First, complex semantics are captured through a fact co-occurrence and autoregressive method with normalization in Euclidean space. The embeddings are then mapped into the tangent space via a scaling mechanism, which preserves semantic information while relearning hierarchical structures through a query-candidate separated modeling approach, and are subsequently projected into hyperbolic space. Finally, a hybrid inductive bias for hierarchical and semantic learning is achieved by combining the hyperbolic and Euclidean scoring functions through a learnable query-specific mixing coefficient that utilizes embeddings from both spaces. Experimental results on four TKG benchmarks show that our method achieves a relative error reduction of up to 15.0% in mean reciprocal rank on YAGO compared with previous single-space models. In addition, enriched visualization analysis validates the effectiveness of our approach, demonstrating adaptive capability on datasets with varying levels of semantic and hierarchical complexity.
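As an illustration of the final scoring step, the following is a sketch, not the paper's implementation, of combining a Euclidean dot-product score with a Poincaré-ball distance score via a learnable query-specific mixing coefficient. The module name, dimensions, curvature, and the particular Euclidean scorer are assumptions.

```python
import torch
import torch.nn as nn

def poincare_dist(u: torch.Tensor, v: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # Geodesic distance on the Poincare ball (curvature -1 assumed).
    sq = (u - v).pow(2).sum(-1)
    un = (1 - u.pow(2).sum(-1)).clamp(min=eps)
    vn = (1 - v.pow(2).sum(-1)).clamp(min=eps)
    return torch.acosh((1 + 2 * sq / (un * vn)).clamp(min=1 + eps))

class HybridScorer(nn.Module):
    """Hypothetical scorer blending Euclidean and hyperbolic similarities."""
    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(dim, 1)  # produces the query-specific coefficient

    def forward(self, q_euc, c_euc, q_hyp, c_hyp):
        alpha = torch.sigmoid(self.gate(q_euc)).squeeze(-1)  # in (0, 1), per query
        s_euc = (q_euc * c_euc).sum(-1)                      # Euclidean: dot product
        s_hyp = -poincare_dist(q_hyp, c_hyp)                 # hyperbolic: -distance
        return alpha * s_hyp + (1 - alpha) * s_euc

# Toy usage: score a query against 10 candidate entities.
d = 32
scorer = HybridScorer(d)
q_euc, c_euc = torch.randn(10, d), torch.randn(10, d)
# Keep hyperbolic points strictly inside the unit ball (radius 0.5 here).
q_hyp = torch.nn.functional.normalize(torch.randn(10, d), dim=-1) * 0.5
c_hyp = torch.nn.functional.normalize(torch.randn(10, d), dim=-1) * 0.5
scores = scorer(q_euc, c_euc, q_hyp, c_hyp)                  # shape: (10,)
```

The sigmoid gate lets each query interpolate between the two geometries, so semantically driven queries can weight the Euclidean score while hierarchy-driven queries weight the hyperbolic one.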