Concept

GNN Without Message Passing

Several recent GNN models propose removing message passing to simplify the GNN. These models can be generally defined as:

$$\mathbf{Z} = \mathrm{MLP}_{\theta}\left(f(\mathbf{A})\,\mathrm{MLP}_{\phi}(\mathbf{X})\right)$$

where $\mathrm{MLP}$ denotes a dense (fully connected) neural network and $f$ is a deterministic function of the adjacency matrix $\mathbf{A}$. For example, Wu et al. (2019, Simple Graph Convolution, SGC) define $f$ as:

$$f(\mathbf{A}) = \tilde{\mathbf{A}}^{k}$$

Here $\tilde{\mathbf{A}}$ is the normalized adjacency matrix with self-loops and $k$ is the number of propagation steps. The intuition is that the convolution layer does not need trainable parameters: instead, we apply dense layers at the start and end of the network and use a deterministic convolution step in the middle to exploit the graph structure. Models of this form have been shown to outperform parameterized message-passing models on many classification benchmarks.
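
As a concrete illustration, below is a minimal PyTorch sketch of this template. The class and helper names (`DecoupledGNN`, `normalized_adjacency`) are hypothetical, not from any specific library, and a dense adjacency matrix is used for readability where a real implementation would use sparse operations.

```python
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    # A_tilde = D^{-1/2} (A + I) D^{-1/2}: adjacency with self-loops,
    # symmetrically normalized (as in SGC).
    adj = adj + torch.eye(adj.size(0))
    deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)


class DecoupledGNN(nn.Module):
    # Z = MLP_theta(f(A) MLP_phi(X)) with f(A) = A_tilde^k: trainable
    # dense layers at the ends, parameter-free propagation in the middle.
    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, k: int):
        super().__init__()
        self.mlp_phi = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.mlp_theta = nn.Linear(hidden_dim, out_dim)
        self.k = k

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.mlp_phi(x)              # MLP_phi(X)
        a_tilde = normalized_adjacency(adj)
        for _ in range(self.k):          # apply f(A) = A_tilde^k
            h = a_tilde @ h
        return self.mlp_theta(h)         # MLP_theta(...)


# Toy usage: 5 nodes, 8 input features, 3 output classes, k = 2 hops.
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0.0)  # undirected, no self-loops
x = torch.randn(5, 8)
model = DecoupledGNN(in_dim=8, hidden_dim=16, out_dim=3, k=2)
z = model(x, adj)    # output shape: (5, 3)
```

Note that in the SGC special case, where $\mathrm{MLP}_{\phi}$ is the identity, the propagated features $\tilde{\mathbf{A}}^{k}\mathbf{X}$ can be precomputed once before training, which is where much of the speedup over message-passing models comes from.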

Tags

Deep Learning (in Machine learning)

Data Science