Learn Before
Concept
Study from Dang et al.
Attention mechanisms have also been applied to improve the training of detection systems. Dang et al. carried out a comprehensive analysis of different types of facial manipulation. They proposed combining attention mechanisms with popular CNN backbones such as XceptionNet and VGG16. For the entire-face-synthesis manipulation, the authors achieved a final 100% AUC and an EER of around 0.1%, considering real faces from the CelebA, FFHQ, and FaceForensics++ databases and fake images generated with the ProGAN and StyleGAN approaches.
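The AUC and EER figures quoted above can be illustrated with a small generic sketch (not the authors' code): a rank-based AUC and a threshold-sweep EER computed over detector scores, where a higher score means "fake". The labels and scores below are made up for illustration.

```python
import numpy as np

def roc_auc(labels, scores):
    """Rank-based AUC: probability a random fake outscores a random real."""
    pos = scores[labels == 1]  # scores for fake samples
    neg = scores[labels == 0]  # scores for real samples
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

def eer(labels, scores):
    """Equal Error Rate: operating point where false-accept ~= false-reject."""
    best_gap, best_eer = np.inf, 1.0
    for t in np.unique(scores):
        pred = scores >= t                  # flag as fake at threshold t
        far = np.mean(pred[labels == 0])    # real faces wrongly flagged
        frr = np.mean(~pred[labels == 1])   # fake faces missed
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2
    return best_eer

# Toy detector outputs (hypothetical data, for illustration only).
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
scores = np.array([0.1, 0.2, 0.3, 0.6, 0.4, 0.7, 0.8, 0.9])
print(roc_auc(labels, scores))  # 0.9375
print(eer(labels, scores))      # 0.25
```

A perfect detector (100% AUC) ranks every fake above every real face; an EER near 0.1% means that at the balanced threshold only about 1 in 1000 samples is misclassified in each direction.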
Updated 2021-08-13
Tags
Data Science