Learn Before
Concept
Breaking The Bottleneck Of Message Passing
Message passing is the fundamental paradigm behind current GNNs and graph representation learning. However, it has several drawbacks. Theoretically, the expressive power of message passing is bounded by the Weisfeiler-Lehman (WL) isomorphism test, limited to simple convolutional filters, and restricted to tree-structured computation graphs. Empirically, researchers find that message passing suffers from over-smoothing, a consequence of neighbor aggregation. The authors believe these problems are deeply connected, and that a new paradigm is needed to overcome this theoretical bottleneck.
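The over-smoothing effect mentioned above can be seen in a minimal sketch (a toy illustration, not from the concept itself): stacking mean-aggregation message-passing layers, with the learnable weights and nonlinearities stripped out, drives all node features toward the same value.

```python
import numpy as np

# Toy graph: a 4-node cycle with self-loops (hypothetical example).
A = np.array([
    [1, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 1],
], dtype=float)
# Row-normalize so each layer averages a node's neighborhood (mean aggregation).
A_hat = A / A.sum(axis=1, keepdims=True)

# Distinct initial 1-d node features.
X = np.array([[1.0], [0.0], [0.0], [0.0]])

# Apply 20 message-passing layers (no weights or nonlinearity).
for _ in range(20):
    X = A_hat @ X

# After many layers, node features have nearly converged: over-smoothing.
spread = float(X.max() - X.min())
```

The initial features span a range of 1.0; after 20 rounds of neighbor averaging, `spread` is negligibly small, so the nodes are no longer distinguishable by their features.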
Updated 2022-07-31
Tags
Deep Learning (in Machine learning)
Data Science