Publication: Multimodal networks
Program
KU Authors
Co-Authors
Mark, James E
Publication Date
Language
Type
Embargo Status
Journal Title
Journal ISSN
Volume Title
Alternative Title
Abstract
The real world involves many graphs and networks that are essentially heterogeneous, in which various types of relations connect multiple types of vertices. With the development of information networks, node features can be described by data of different modalities, resulting in multimodal heterogeneous graphs. However, most existing methods can only handle unimodal heterogeneous graphs. Moreover, most existing heterogeneous graph mining methods rely on meta-paths, which require domain experts to design. In this paper, we propose a novel multimodal heterogeneous graph attention network (MHGAT) to address these problems. Specifically, we exploit edge-level aggregation to adaptively capture graph heterogeneity and obtain more informative representations. Further, we use a modality-level attention mechanism to fuse multimodal information. Because plain graph convolutional networks cannot capture higher-order neighborhood information, we employ residual and dense connections to obtain it. Extensive experimental results show that MHGAT outperforms state-of-the-art baselines on three datasets for node classification, clustering, and visualization tasks.
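Note: the full paper is not included in this record. As a rough illustration of the modality-level attention fusion mentioned in the abstract, the following PyTorch sketch (all names and dimensions hypothetical, not taken from the paper) scores each modality's node embedding with a shared attention vector and combines them as a weighted sum.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityAttentionFusion(nn.Module):
    """Hypothetical sketch: fuse per-modality node embeddings with learned attention."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.project = nn.Linear(dim, hidden)          # shared projection for all modalities
        self.attn_vector = nn.Parameter(torch.randn(hidden))  # shared attention vector

    def forward(self, modality_embeddings: torch.Tensor) -> torch.Tensor:
        # modality_embeddings: (num_nodes, num_modalities, dim)
        scores = torch.tanh(self.project(modality_embeddings)) @ self.attn_vector
        weights = F.softmax(scores, dim=1)             # attention over modalities
        return (weights.unsqueeze(-1) * modality_embeddings).sum(dim=1)

# Usage: fuse two modalities (e.g. text and image features) for 5 nodes
fusion = ModalityAttentionFusion(dim=16)
z = torch.randn(5, 2, 16)   # 5 nodes, 2 modalities, 16-dim embeddings
fused = fusion(z)           # (5, 16) fused node representations
```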
Source
Publisher
Cambridge Univ Press
Subject
Chemistry, Applied chemistry, Polymer science
Citation
Has Part
Source
Rubberlike Elasticity: A Molecular Primer, 2nd Edition
Book Series Title
Edition
DOI
10.1017/CBO9780511541322.015