Memory-Efficient Information Filtering in Contrastive Learning for Temporal Knowledge Graph Reasoning
Authors: Fernando, E.K., Adjeisah, M., Chang, J., Zhang, J.J.
Conference: 2025 IEEE International Conference on Knowledge Graph (ICKG)
Dates: 13/11/2025
Publication Date: 26/02/2026
Pages: 82-89
Publisher: IEEE
DOI: 10.1109/ICKG66886.2025.00018
Abstract: Temporal knowledge graphs (TKGs) have emerged as a critical component in modern artificial intelligence systems, enabling machines to reason over dynamic information. However, existing methods for TKG reasoning consume massive amounts of information and computational resources, failing to capture the minimal essential temporal knowledge and its dynamics while eliminating irrelevant or noisy information for a more precise reasoning process. To this end, we introduce the Forest Fire Contrastive Approach (FFCA), a contrastive learning architecture based on forest fire sampling that provides a preferential attachment mechanism for TKG extrapolation. This allows high-degree nodes to attract new connections, improving the pipeline's predictive capability while keeping it compact, leading to an efficient learning and inference process. The approach introduces a degree-biased burn probability that gathers a minimal but highly correlated subgraph relevant to the query as the global view. Simultaneously, a sufficient number of the most recent snapshots are gathered as the local view, preserving task-relevant graph information and removing noise while reducing computational complexity and memory requirements, improving the sustainability of TKG reasoning models. Experiments on three publicly available benchmark datasets widely used for TKG extrapolation tasks demonstrate that the proposed approach achieves predictive performance competitive with current state-of-the-art methods while delivering substantial improvements in memory efficiency, a critical consideration for scalability to large-scale temporal knowledge graph datasets.
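The degree-biased forest fire sampling described in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's implementation: the function name, the `base_p` parameter, and the exact degree-scaling formula are assumptions; the paper's burn probability may differ.

```python
# Illustrative sketch of degree-biased forest fire sampling (hypothetical;
# parameter names and the exact bias formula are assumptions, not FFCA's).
import random
from collections import deque

def forest_fire_sample(adj, seed, base_p=0.4, max_nodes=50, rng=None):
    """Sample a subgraph by 'burning' outward from a seed node.

    A neighbor v catches fire with probability base_p scaled by its
    normalized degree, so high-degree hubs are preferentially included,
    mimicking a preferential-attachment bias.
    """
    rng = rng or random.Random(0)
    max_deg = max(len(nbrs) for nbrs in adj.values()) or 1
    burned = {seed}          # nodes included in the sampled subgraph
    frontier = deque([seed]) # nodes whose neighbors may still ignite
    while frontier and len(burned) < max_nodes:
        u = frontier.popleft()
        for v in adj[u]:
            if v in burned:
                continue
            # Degree-biased burn probability: hubs burn more readily.
            p = base_p * (0.5 + 0.5 * len(adj[v]) / max_deg)
            if rng.random() < p:
                burned.add(v)
                frontier.append(v)
                if len(burned) >= max_nodes:
                    break
    return burned

# Example on a tiny undirected graph (adjacency lists):
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2, 4], 4: [3]}
subgraph = forest_fire_sample(adj, seed=0, base_p=1.0, max_nodes=10,
                              rng=random.Random(42))
```

The `burned` set would serve as the query-relevant "global view" subgraph; in the paper's setting the most recent snapshots form the complementary local view.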
Source: Manual