Texture, Shape, Order, and Relation Matter: A New Transformer Design for Sequential DeepFake Detection
Authors: Yunfei Li, Yuezun Li, Baoyuan Wu, Junyu Dong, Guopu Zhu, Siwei Lyu
Published: 2024-04-22 04:47:52+00:00
AI Summary
This paper proposes TSOM and TSOM++, Transformer designs tailored to sequential deepfake detection. TSOM exploits the texture, shape, and prediction order of manipulations; TSOM++ additionally applies a sequential contrastive learning scheme to model relations between manipulations. Both outperform state-of-the-art methods.
Abstract
Sequential DeepFake detection is an emerging task that predicts the manipulation sequence in order. Existing methods typically formulate it as an image-to-sequence problem, employing conventional Transformer architectures; lacking a dedicated design, they achieve only limited performance. This paper therefore describes a new Transformer design, called TSOM, built around three perspectives: the Texture, Shape, and Order of Manipulations. Our method features four major improvements: (1) a new texture-aware branch that effectively captures subtle manipulation traces with a Diversiform Pixel Difference Attention module; (2) a Multi-source Cross-attention module that seeks deep correlations among spatial and sequential features, enabling effective modeling of complex manipulation traces; (3) a Shape-guided Gaussian mapping strategy that further enhances the cross-attention by providing initial priors on the manipulation shape; and (4) an inverted prediction order: observing that a subsequent manipulation in a sequence may disturb the traces left by the preceding one, we predict the sequence backward rather than forward, which yields notable gains. Building upon TSOM, we introduce an extended method, TSOM++, which additionally explores the Relation of manipulations: (5) a new sequential contrastive learning scheme captures relationships between the manipulation types in a sequence, further enhancing the detection of manipulation traces. Extensive experiments against several state-of-the-art methods demonstrate the superiority of our method. The code has been released at https://github.com/OUC-VAS/TSOM.
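To make the texture-aware branch (1) concrete, below is a minimal PyTorch sketch of a central-difference convolution, the family of pixel-difference operators that a Diversiform Pixel Difference Attention module would build on. The class name, the `theta` blend factor, and the 3×3 kernel are illustrative assumptions, not the paper's exact module.

```python
import torch.nn as nn
import torch.nn.functional as F

class CentralDifferenceConv(nn.Module):
    """Convolution over pixel differences (neighbor - center) rather than raw
    intensities, which amplifies the subtle high-frequency traces left by
    manipulations. `theta` blends the difference term with a vanilla conv."""
    def __init__(self, in_ch, out_ch, theta=0.7):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        self.theta = theta

    def forward(self, x):
        vanilla = self.conv(x)  # standard convolution response
        # Summing each kernel collapses it to its action on the center pixel,
        # so subtracting this term leaves a pixel-difference response.
        kernel_sum = self.conv.weight.sum(dim=(2, 3), keepdim=True)  # (out, in, 1, 1)
        center = F.conv2d(x, kernel_sum)  # 1x1 conv on the center pixel
        return vanilla - self.theta * center
```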
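One plausible way to realize (2) and (3) together is a cross-attention block whose keys and values pool tokens from two feature sources, with a 2-D Gaussian map added to the attention logits as a soft shape prior. The names, the use of `nn.MultiheadAttention`, and the additive-bias mechanism are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn

def gaussian_prior(h, w, center, sigma):
    """2-D Gaussian map peaking at a predicted manipulation center; a
    hypothetical realization of the Shape-guided Gaussian mapping."""
    ys = torch.arange(h, dtype=torch.float32).view(h, 1)
    xs = torch.arange(w, dtype=torch.float32).view(1, w)
    cy, cx = center
    return torch.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

class MultiSourceCrossAttention(nn.Module):
    """Cross-attention whose keys/values mix two token sources, e.g. spatial
    RGB features and texture-branch features."""
    def __init__(self, dim, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, queries, spatial_tokens, texture_tokens, prior_bias=None):
        kv = torch.cat([spatial_tokens, texture_tokens], dim=1)  # (B, N1+N2, D)
        # prior_bias: optional (L, N1+N2) float mask added to the attention
        # logits, e.g. a flattened Gaussian map tiled over both sources.
        out, _ = self.attn(queries, kv, kv, attn_mask=prior_bias)
        return out
```

In use, the (h, w) Gaussian map would be flattened to token order and broadcast across query positions so that tokens near the predicted manipulation region receive higher attention logits.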
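Improvement (4) amounts to reversing the supervision order for the autoregressive decoder. A minimal sketch under a conventional seq2seq training setup (the function name and the <bos>/<eos> token convention are assumptions):

```python
def make_backward_targets(manip_seq, bos_id, eos_id):
    """Train the decoder to emit the LAST manipulation first: later edits can
    disturb the traces of earlier ones, so the most recent operation is the
    easiest to detect and makes the best starting point."""
    rev = list(reversed(manip_seq))
    decoder_input = [bos_id] + rev   # teacher-forcing input
    decoder_target = rev + [eos_id]  # shifted prediction target
    return decoder_input, decoder_target

# e.g. a face edited as [nose, eye, lip] is supervised as [lip, eye, nose]
```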
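For the Relation perspective (5) in TSOM++, a supervised contrastive objective over sequence-level embeddings is one natural reading of "sequential contrastive learning": samples whose manipulation sequences match are pulled together and all others pushed apart. The function name, temperature, and labeling scheme below are assumptions, not the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def sequential_contrastive_loss(embeds, seq_labels, tau=0.07):
    """Supervised contrastive loss over sequence embeddings (illustrative):
    anchors are attracted to samples sharing the same manipulation-sequence
    label and repelled from the rest."""
    z = F.normalize(embeds, dim=1)                         # (B, D) unit vectors
    logits = z @ z.t() / tau                               # (B, B) scaled cosine sims
    self_mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    logits = logits.masked_fill(self_mask, float("-inf"))  # exclude self-pairs
    pos = (seq_labels.unsqueeze(0) == seq_labels.unsqueeze(1)) & ~self_mask
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    per_anchor = -log_prob.masked_fill(~pos, 0.0).sum(1) / pos.sum(1).clamp(min=1)
    return per_anchor[pos.any(1)].mean()                   # anchors with a positive
```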