Learn Before
Concept

Graphs as Graphical Models

Taking a probabilistic view of graph data, we can assume that the given graph structure defines the dependencies between the different nodes. A graph $G = (V, E)$ defines a Markov random field:

$$p(\{x_v\}, \{z_v\}) \propto \prod_{v \in V} \Phi(x_v, z_v) \prod_{(u, v) \in E} \Psi(z_u, z_v)$$

This equation says that the distribution $p(\{x_v\}, \{z_v\})$ over node features $x_v$ and node embeddings $z_v$ factorizes according to the graph structure. In other words, each node's features are determined by its latent embedding (through the node potential $\Phi$), and the latent embeddings of connected nodes are dependent on each other (through the edge potential $\Psi$).
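The factorization above can be sketched numerically. The following is a minimal illustration, not a standard implementation: the graph, the Gaussian choices for $\Phi$ and $\Psi$, and all names are assumptions made for the example. It computes the unnormalized density and shows that configurations where embeddings track both their features and their neighbors score higher.

```python
import numpy as np

# Hypothetical 3-node path graph: 0 - 1 - 2 (an assumed example graph).
edges = [(0, 1), (1, 2)]

def phi(x_v, z_v, sigma=1.0):
    # Node potential: Gaussian compatibility between a node's feature x_v
    # and its latent embedding z_v (an assumed choice of potential).
    return np.exp(-np.sum((x_v - z_v) ** 2) / (2 * sigma ** 2))

def psi(z_u, z_v, tau=1.0):
    # Edge potential: Gaussian smoothness encouraging embeddings of
    # connected nodes to agree (an assumed choice of potential).
    return np.exp(-np.sum((z_u - z_v) ** 2) / (2 * tau ** 2))

def unnormalized_density(x, z):
    # p(x, z) is proportional to
    #   prod_{v in V} Phi(x_v, z_v) * prod_{(u, v) in E} Psi(z_u, z_v)
    node_term = np.prod([phi(x[v], z[v]) for v in range(len(x))])
    edge_term = np.prod([psi(z[u], z[v]) for (u, v) in edges])
    return node_term * edge_term

x = np.array([[0.0], [0.1], [0.2]])          # observed node features
z_close = np.array([[0.0], [0.1], [0.2]])    # embeddings agree with features and neighbors
z_far = np.array([[0.0], [5.0], [0.2]])      # node 1's embedding disagrees with both

print(unnormalized_density(x, z_close) > unnormalized_density(x, z_far))  # True
```

Because both potentials decay with squared distance, the "close" configuration receives a much larger unnormalized score, which is exactly the dependence structure the factorization encodes.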

Updated 2022-07-15

Tags

Data Science