Concept

Adversarial Attacks

An adversarial attack is a type of adversarial method that "fools" a target model by adding small perturbations to the original data. For example, Nettack modifies both the graph structure and node attributes, and uses a loss function that searches for the best legitimate (unnoticeable) change that causes a target node to be misclassified.
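The core idea — nudge each input feature slightly in the direction that increases the model's loss until the prediction flips — can be sketched with a fast-gradient-sign-style perturbation. The sketch below uses a hand-picked logistic-regression model (the weights, input, and epsilon are illustrative assumptions, not from any real attack or dataset) rather than a graph model like the one Nettack targets.

```python
import numpy as np

def predict(w, b, x):
    """Logistic-regression probability for a single example."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def fgsm_perturb(w, b, x, y, eps):
    """Step each feature by eps in the sign of the loss gradient.

    For binary cross-entropy with sigmoid output, the gradient of the
    loss w.r.t. the input x is (p - y) * w.
    """
    p = predict(w, b, x)
    grad_x = (p - y) * w
    return x + eps * np.sign(grad_x)

# Toy model with hand-chosen weights (illustrative, not trained).
w = np.array([1.0, -2.0, 0.5])
b = 0.0
x = np.array([0.3, -0.2, 0.1])  # clean input, true label 1
y = 1.0

x_adv = fgsm_perturb(w, b, x, y, eps=0.5)
# The clean input is classified correctly (p > 0.5); the perturbed
# input, only eps away per feature, is pushed below the 0.5 boundary.
```

Nettack applies the same principle in a discrete setting: instead of a continuous gradient step, it scores candidate edge and feature flips and greedily applies the ones that most increase the target node's misclassification loss while staying "unnoticeable."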

Updated 2022-06-12

Tags

Deep Learning (in Machine learning)

Data Science