Learn Before
Concept
Over-Smoothing As A Low-Pass Convolutional Filter
Suppose we have a simplified GNN that drops the nonlinearity and weight matrices, so each layer is just H^(k) = Ã H^(k-1), where Ã = D̃^(-1/2) (A + I) D̃^(-1/2) is the self-loop-augmented, symmetrically normalized adjacency matrix. Unrolling k layers gives H^(k) = Ã^k H^(0).
If k is big enough that we reach a fixed point, then we'll have H^(k) = Ã H^(k). This iteration is exactly power iteration on Ã: the dominant eigenvalue of Ã is 1 and all other eigenvalues have magnitude less than 1, so at this fixed point every node representation converges to be defined by the dominant eigenvector of Ã.
So we can see that stacking many rounds of message passing acts as a low-pass convolutional filter: the high-frequency, locally varying components of the node features are attenuated, and all node representations can become nearly identical and uninformative.
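The convergence above can be sketched numerically. The snippet below is a minimal illustration (the 5-node path graph and random features are made up for the example): it repeatedly multiplies node features by the normalized adjacency and checks that every feature column aligns with the dominant eigenvector, i.e. all nodes become scalar multiples of one another.

```python
import numpy as np

# Hypothetical 5-node path graph used only for illustration.
n = 5
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

A_hat = A + np.eye(n)                        # add self-loops
d_inv_sqrt = A_hat.sum(axis=1) ** -0.5
A_norm = (A_hat * d_inv_sqrt).T * d_inv_sqrt # symmetric normalization D̃^(-1/2) Ã D̃^(-1/2)

rng = np.random.default_rng(0)
H = rng.normal(size=(n, 3))                  # random initial node features

# Simplified message passing with no weights or nonlinearity: H <- A_norm @ H.
for _ in range(200):
    H = A_norm @ H

# Dominant eigenvector of the normalized adjacency (eigh returns ascending order).
eigvals, eigvecs = np.linalg.eigh(A_norm)
v = eigvecs[:, -1]

# Each feature column should now be (up to sign) a multiple of v,
# so node representations differ only by scale: over-smoothing.
for c in range(H.shape[1]):
    col = H[:, c] / np.linalg.norm(H[:, c])
    print(abs(col @ v))  # ≈ 1.0
```

Because the dominant eigenvalue of the self-loop-normalized adjacency is 1 while all others have magnitude below 1, the iteration neither explodes nor vanishes; it simply projects the features onto a single direction.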
Updated 2022-07-17
Tags
Deep Learning (in Machine Learning)
Data Science