Learn Before
Concept
Edge Dropout
In edge dropout, we randomly remove (or mask) edges in the adjacency matrix during training, with the intuition that this will make the GNN less prone to overfitting and more robust to noise in the adjacency matrix. This approach has been particularly successful in the application of GNNs to knowledge graphs, and it was an essential technique used in the original graph attention network (GAT) work.
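The masking step described above can be sketched in a few lines of NumPy. This is a minimal illustration under assumed conventions (a dense, symmetric adjacency matrix with no self-loops, and a per-edge drop probability `drop_prob`); the function name and parameters are illustrative, not from any particular library.

```python
import numpy as np

def edge_dropout(adj, drop_prob=0.2, rng=None):
    """Randomly zero out edges of an adjacency matrix during training.

    Each undirected edge is kept independently with probability
    1 - drop_prob. The keep-mask is sampled on the upper triangle and
    mirrored so the graph stays undirected.
    """
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(adj.shape) >= drop_prob   # per-entry keep decisions
    keep = np.triu(keep, 1)                     # upper triangle, no self-loops
    keep = keep | keep.T                        # symmetrize the mask
    return adj * keep

# Toy 4-node undirected graph
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]], dtype=float)

# At each training step, the GNN would message-pass over a freshly
# sampled masked adjacency rather than the full one
dropped = edge_dropout(adj, drop_prob=0.5, rng=np.random.default_rng(0))
```

At inference time the full adjacency matrix is used, analogous to how standard dropout is disabled at test time.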
Updated 2022-07-15
Tags
Data Science