Memory-Enhanced Transformer for Representation Learning on Temporal Heterogeneous Graphs
Published in | Data Science and Engineering, Vol. 8, No. 2, pp. 98-111 |
---|---|
Main Authors | , , , , , , , |
Format | Journal Article |
Language | English |
Published | Singapore: Springer Nature Singapore, 01.06.2023 |
ISSN | 2364-1185, 2364-1541 |
DOI | 10.1007/s41019-023-00207-w |
Summary: | Temporal heterogeneous graphs can model many complex real-world systems, such as social networks and e-commerce applications, which are naturally time-varying and heterogeneous. As most existing graph representation learning methods cannot efficiently handle both of these characteristics, we propose a Transformer-like representation learning model, named THAN, to learn low-dimensional node embeddings that simultaneously preserve the topological structure features, heterogeneous semantics, and dynamic patterns of temporal heterogeneous graphs. Specifically, THAN first samples heterogeneous neighbors under temporal constraints and projects node features into the same vector space; it then encodes time information and aggregates neighborhood influence with different weights via type-aware self-attention. To capture long-term dependencies and evolutionary patterns, we design an optional memory module that stores and evolves dynamic node representations. Experiments on three real-world datasets demonstrate that THAN outperforms state-of-the-art methods on the temporal link prediction task. |
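The summary sketches THAN's core aggregation step: project heterogeneous neighbor features into a shared space with type-specific matrices, add a functional time encoding of each interaction's time delta, and combine neighbors via self-attention. The following is a minimal illustrative sketch of that idea only, not the authors' implementation; the node types, dimensions, and cosine-based time encoding are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # shared embedding dimension (assumed for illustration)

def time_encode(dt, dim=D):
    """Encode a time delta with fixed cosine frequencies (an assumed,
    Bochner-style functional time encoding)."""
    freqs = 1.0 / 10.0 ** np.linspace(0, 4, dim)
    return np.cos(dt * freqs)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Type-specific projections map heterogeneous node features into one space.
proj = {"user": rng.normal(size=(D, D)), "item": rng.normal(size=(D, D))}

def aggregate(target_feat, target_time, neighbors):
    """neighbors: list of (features, node_type, event_time) sampled
    under the temporal constraint event_time <= target_time."""
    q = target_feat + time_encode(0.0)                # query: target at dt = 0
    keys = np.stack([proj[t] @ f + time_encode(target_time - ts)
                     for f, t, ts in neighbors])      # type-aware keys
    attn = softmax(keys @ q / np.sqrt(D))             # scaled dot-product weights
    return attn @ keys                                # weighted neighbor summary

target = rng.normal(size=D)
neigh = [(rng.normal(size=D), "user", 3.0),
         (rng.normal(size=D), "item", 7.0)]
out = aggregate(target, 10.0, neigh)
print(out.shape)  # (8,)
```

In this toy version each neighbor type gets its own projection before attention, which is the sense in which the attention is "type-aware"; the paper's actual parameterization, multi-head structure, and memory module are not reproduced here.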
Bibliography: | ObjectType-Article-1, SourceType-Scholarly Journals-1, ObjectType-Feature-2 |
ISSN: | 2364-1185, 2364-1541 |
DOI: | 10.1007/s41019-023-00207-w |