Concept

Encoder For Graph-Level Latents

For any GNN model, the encoder can be modified into a graph-level variational encoder by adding a pooling layer, i.e.,

$$\mu_{\mathbf{z}_G}=\mathrm{POOL}_{\mu}\big(\mathrm{GNN}_{\mu}(\mathbf{A}, \mathbf{X})\big), \qquad \log \sigma_{\mathbf{z}_G}=\mathrm{POOL}_{\sigma}\big(\mathrm{GNN}_{\sigma}(\mathbf{A}, \mathbf{X})\big)$$

where $\mathrm{POOL}: \mathbb{R}^{|V|\times d}\rightarrow \mathbb{R}^{d}$ is a pooling (readout) function that maps the node-level representations to a single graph-level vector. Here we use two different GNNs to parameterize the mean and variance of the posterior distribution, and we define one posterior per graph rather than one per node.
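The two equations above can be sketched in a few lines. The following is a minimal NumPy illustration, not a production implementation: it assumes a single mean-aggregation message-passing layer as the GNN, mean pooling over nodes as $\mathrm{POOL}$, and the standard reparameterization trick for sampling $\mathbf{z}_G$; all function and variable names (`gnn_layer`, `mean_pool`, `encode_graph`, `W_mu`, `W_sigma`) are illustrative, not from the original text.

```python
import numpy as np

def gnn_layer(A, X, W):
    # One message-passing step: mean-aggregate neighbor features
    # (with self-loops), then apply a linear map and a nonlinearity.
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees for mean aggregation
    return np.tanh((A_hat / deg) @ X @ W)

def mean_pool(H):
    # POOL: R^{|V| x d} -> R^d, here simply the mean over all nodes.
    return H.mean(axis=0)

def encode_graph(A, X, W_mu, W_sigma, rng):
    # Two separate GNNs parameterize the mean and log-std of q(z_G | A, X).
    mu = mean_pool(gnn_layer(A, X, W_mu))
    log_sigma = mean_pool(gnn_layer(A, X, W_sigma))
    # Reparameterization trick: z_G = mu + sigma * eps, eps ~ N(0, I).
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

# Toy path graph on 4 nodes, 3-dim node features, 2-dim graph latent.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))
W_mu = rng.standard_normal((3, 2))
W_sigma = rng.standard_normal((3, 2))
z_G = encode_graph(A, X, W_mu, W_sigma, rng)
print(z_G.shape)  # a single latent vector for the whole graph, not per node
```

The key point the sketch makes concrete is the pooling step: without `mean_pool`, the encoder would produce one latent per node (as in a node-level VGAE); with it, the posterior is defined over a single graph-level latent.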


Updated 2022-07-24


Tags: Deep Learning (in Machine Learning), Data Science