TruFor: Leveraging all-round clues for trustworthy image forgery detection and localization

Authors: Fabrizio Guillaro, Davide Cozzolino, Avneesh Sud, Nicholas Dufour, Luisa Verdoliva

Published: 2022-12-21 11:49:43+00:00

AI Summary

TruFor is a forensic framework for image forgery detection and localization that leverages both high-level and low-level image features. It uses a transformer-based fusion architecture combining RGB image data with a learned noise-sensitive fingerprint (Noiseprint++) to detect forgeries as deviations from expected patterns, achieving state-of-the-art performance.

Abstract

In this paper we present TruFor, a forensic framework that can be applied to a large variety of image manipulation methods, from classic cheapfakes to more recent manipulations based on deep learning. We rely on the extraction of both high-level and low-level traces through a transformer-based fusion architecture that combines the RGB image and a learned noise-sensitive fingerprint. The latter learns to embed the artifacts related to the camera's internal and external processing by training only on real data in a self-supervised manner. Forgeries are detected as deviations from the expected regular pattern that characterizes each pristine image. Looking for anomalies enables the approach to robustly detect a variety of local manipulations, ensuring generalization. In addition to a pixel-level localization map and a whole-image integrity score, our approach outputs a reliability map that highlights areas where localization predictions may be error-prone. This is particularly important in forensic applications in order to reduce false alarms and allow for large-scale analysis. Extensive experiments on several datasets show that our method reliably detects and localizes both cheapfake and deepfake manipulations, outperforming state-of-the-art works. Code is publicly available at https://grip-unina.github.io/TruFor/


Key findings
TruFor outperforms state-of-the-art methods in both forgery detection and localization across diverse datasets, including those with GAN- and diffusion-based manipulations. The incorporation of Noiseprint++ improves robustness to image processing, and the confidence map reduces false alarms.
Approach
TruFor extracts a learned noise-sensitive fingerprint (Noiseprint++) from the input image and combines it with the RGB data in a transformer-based fusion architecture. The network produces a pixel-level anomaly map, which serves as the localization map, together with a confidence map; a forgery detector then pools both maps into a global integrity score for the whole image.
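The two-branch idea above can be sketched in a few lines of NumPy. This is only an illustrative toy, not TruFor's implementation: the real Noiseprint++ is a learned DnCNN fingerprint and the fusion is a trained transformer, whereas here a simple box-blur high-pass residual stands in for the noise branch, robust z-scores stand in for the anomaly map, and a saturating mean stands in for the detector's pooled integrity score. All function names are hypothetical.

```python
import numpy as np

def noise_residual(img, k=3):
    # Stand-in for the noise-sensitive branch: subtract a k x k box blur
    # to keep only high-frequency content (the real model learns this).
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    blurred = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    blurred /= k * k
    return img.astype(float) - blurred

def analyze(img):
    # Toy analysis: flag pixels whose residual deviates from the
    # image-wide statistics (robust z-score via median/MAD), derive a
    # crude confidence map, and pool into a global integrity score.
    res = noise_residual(img)
    med = np.median(res)
    mad = np.median(np.abs(res - med))
    sigma = 1.4826 * mad + 1e-8
    anomaly = np.abs(res - med) / sigma          # pixel-level anomaly map
    confidence = 1.0 / (1.0 + anomaly)           # crude reliability proxy
    score = float(np.mean(anomaly / (1.0 + anomaly)))  # higher = more suspicious
    return anomaly, confidence, score
```

A pasted region whose noise statistics differ from the host image raises the anomaly map locally and therefore the global score, which is the intuition behind detecting forgeries as deviations from the expected pristine pattern.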
Datasets
Evaluation: Flickr, DPReview, CASIA v1, Coverage, Columbia, NIST16, DSO-1, VIPP, OpenForensics, and CocoGlide (a dataset created by the authors using the GLIDE diffusion model and the COCO 2017 validation set). Training: CASIA v2, FantasticReality, IMD2020, and a dataset of manipulated images created by [26].
Model(s)
DnCNN (for Noiseprint++ extraction), SegFormer (for anomaly localization), a custom architecture combining cross-modal feature rectification (CM-FRM) and feature fusion (FFM) modules (for fusion), and a fully connected network (for forgery detection).
Author countries
Italy, United States