Learn Before
Regularization in GNN
Edge Dropout
In edge dropout, we randomly remove (or mask) edges in the adjacency matrix during training. The intuition is that this makes the GNN less prone to overfitting and more robust to noise in the adjacency matrix. This approach has been particularly successful in applying GNNs to knowledge graphs, and it was an essential technique in the original graph attention network (GAT) work.
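As a minimal sketch of the idea, the following NumPy function (a hypothetical helper, not from any particular GNN library) masks each undirected edge of a dense adjacency matrix independently with some probability during training, while preserving symmetry and self-loops:

```python
import numpy as np

def edge_dropout(adj, drop_prob=0.5, rng=None):
    """Randomly mask edges of a symmetric adjacency matrix during training.

    adj: (N, N) adjacency matrix; drop_prob: probability of removing each edge.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Sample a keep/drop decision over the upper triangle only, so each
    # undirected edge is kept or dropped as a single unit, then mirror it.
    upper = np.triu(np.ones_like(adj, dtype=bool), k=1)
    keep = rng.random(adj.shape) >= drop_prob
    mask = np.where(upper, keep, False)
    mask = mask | mask.T | np.eye(adj.shape[0], dtype=bool)  # keep self-loops
    return adj * mask
```

At test time the full adjacency matrix is used unchanged, analogous to ordinary dropout; real implementations typically also rescale or operate on sparse edge lists rather than a dense matrix.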
Tags
Data Science
Related
Parameter Sharing