Learn Before
Concept

Pretraining a GNN

  • GNN pre-training is an open and active area of research.

  • Pre-training is an effective way of injecting domain knowledge into a model before the actual training takes place.

  • Pre-training a GNN with a neighborhood reconstruction loss (reconstructing missing edges) before a classification task does not improve the classification loss.

  • A pre-training method called Deep Graph Infomax maximizes the mutual information between node embeddings $z_u$ and graph embeddings $z_{\mathcal{G}}$.
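The Deep Graph Infomax objective above can be sketched numerically. This is a minimal illustration, not a full implementation: it assumes randomly generated node embeddings standing in for GNN outputs, a mean readout for the graph embedding, and a bilinear discriminator $D(z_u, z_{\mathcal{G}}) = \sigma(z_u^\top W z_{\mathcal{G}})$; embeddings from a "corrupted" graph (in practice, one with shuffled node features) serve as negatives.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed stand-ins for GNN outputs: n nodes, d embedding dims.
n, d = 8, 4
Z = rng.normal(size=(n, d))          # node embeddings z_u from the real graph
Z_corrupt = rng.normal(size=(n, d))  # embeddings from a corrupted graph (negatives)
z_G = Z.mean(axis=0)                 # graph embedding via mean readout
W = rng.normal(size=(d, d))          # bilinear discriminator weights

def discriminator(z_u, z_g, W):
    # D(z_u, z_G) = sigmoid(z_u^T W z_G): probability that z_u
    # came from the real graph rather than the corrupted one.
    return sigmoid(z_u @ W @ z_g)

# Binary cross-entropy loss: real pairs should score high, corrupted pairs low.
# Minimizing it maximizes a lower bound on the mutual information
# between node embeddings and the graph embedding.
pos = np.array([discriminator(z, z_G, W) for z in Z])
neg = np.array([discriminator(z, z_G, W) for z in Z_corrupt])
eps = 1e-12
loss = -(np.log(pos + eps).mean() + np.log(1.0 - neg + eps).mean())
```

In actual training, `Z` would come from a GNN encoder and the loss would be minimized with respect to both the encoder parameters and `W`.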


Updated 2022-07-17

Tags

Data Science