Good or Evil: Generative Adversarial Networks in Digital Forensics (Book Chapter)

Veksler, M., & Akkaya, K. (2024). Good or Evil: Generative Adversarial Networks in Digital Forensics. 104, 55–91. https://doi.org/10.1007/978-3-031-49803-9_3

cited authors

  • Veksler, M; Akkaya, K

abstract

  • Generative adversarial networks (GANs) are widely used across domains to generate new data or modify existing data. Specifically, GANs can improve the quality of images and videos, generate synthetic data such as fictional characters and deepfakes, and perform data adaptation. In digital forensics, GANs can be used to improve the quality of machine learning (ML) algorithms, because existing ML-based forensic techniques often lack robustness due to the limited availability of the specialized data required for training and the need for prior knowledge about the potential attack. In these cases, GANs are employed to complement existing datasets with newly augmented synthetic samples within a short time frame and to improve the generalization capability of forensic classifiers. At the same time, GANs introduce new challenges when used as anti-forensic tools to fool classifiers: an adversary may remove or modify crucial evidence present in the data, or create entirely new data that closely resembles the original. In this work, we present a literature overview of the application of GANs in the digital forensics domain to demonstrate their versatility. We identify the architectural commonalities and trends of GANs used for anti-forensics in different domains. We also categorize GAN methods used to improve forensic classifiers based on their deployment. Finally, we use qualitative metrics to analyze GAN-based methods designed for both forensic and anti-forensic purposes, identifying potential future research directions based on the benefits and drawbacks we discover.
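
The augmentation idea the abstract describes — a generator and a discriminator trained adversarially until synthetic samples resemble the real data — can be sketched with a minimal 1-D toy example. This is purely illustrative and not the chapter's method; the distributions, model forms, and hyperparameters below are assumptions chosen for simplicity:

```python
# Toy GAN-style augmentation sketch (illustrative, not from the chapter):
# a 1-D affine generator learns to mimic "real" feature values ~ N(3, 1),
# starting from samples of N(0, 1).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Discriminator D(x) = sigmoid(w*x + c); generator G(z) = a*z + b.
w, c = 0.0, 0.0          # discriminator parameters
a, b = 1.0, 0.0          # generator parameters
lr, batch = 0.05, 64

for step in range(3000):
    real = rng.normal(3.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: minimize -log D(real) - log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - d_real) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(-(1 - d_real)) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator step (non-saturating loss): minimize -log D(G(z)).
    d_fake = sigmoid(w * fake + c)
    grad_a = np.mean(-(1 - d_fake) * w * z)
    grad_b = np.mean(-(1 - d_fake) * w)
    a -= lr * grad_a
    b -= lr * grad_b

# "Synthetic samples" that could augment a scarce forensic dataset.
synthetic = a * rng.normal(0.0, 1.0, 1000) + b
print(float(np.mean(synthetic)))  # mean should drift toward the real mean, 3
```

The same alternating-update structure underlies both uses the abstract contrasts: a forensic analyst keeps the generator's output as augmented training data, while an anti-forensic adversary uses it to produce evidence-like data that fools a classifier.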

publication date

  • January 1, 2024

Digital Object Identifier (DOI)

  • 10.1007/978-3-031-49803-9_3

start page

  • 55

end page

  • 91

volume

  • 104