Engineering and technology
- Video communications
- Image and language processing
- Pattern recognition and neural networks
- Analogue and digital signal processing
Imagine a video on the internet in which Zelenskyy makes false statements. Such a video could have devastating consequences, yet it might be fake. Is it really the President, or is it a forgery? This proposal aims to answer that question. To detect manipulation, we exploit “compression history fingerprints”. This concept builds on the observation that fake regions typically underwent a different number of compression steps: if a fake face is added to a video, for example, the real regions are compressed twice, while the fake regions are compressed only once. Manipulations are therefore revealed by detecting inconsistencies in the traces of compression. The last video compression step in particular has the potential to reveal more information about a video’s origin than is currently retrieved by the state of the art. In this project, we propose a video range decoder that not only decodes the last compression step, but also provides the range of values in which the encoded video sample values lie. We aim to overcome the two main limitations of the state of the art: lack of generalization and lack of robustness. First, the proposal is not limited to a single type of forgery, but generalizes to forgeries found in the wild. Second, better robustness against recompression attacks is achieved through a combination of strategies: the new video range decoder, deep learning, and multi-task learning. Our preliminary research demonstrated strong robustness to compression artifacts, which indicates a high potential for the proposed approach.
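The double-compression fingerprint described above can be illustrated with a small numerical sketch. This is a toy model, not the proposed video range decoder: it quantizes synthetic, Laplacian-distributed DCT-like coefficients once (a singly compressed “fake” region) versus twice (a doubly compressed “real” region), using hypothetical quantization steps q1 = 8 and q2 = 5 chosen only for illustration. Double quantization leaves periodic empty bins in the coefficient histogram, which is the kind of compression-history inconsistency a detector can exploit.

```python
import numpy as np

def quantize(coeffs, q):
    """Quantize and dequantize coefficients with step size q."""
    return np.round(coeffs / q) * q

def empty_bins(values, q, lo=-20, hi=20):
    """Count empty histogram bins of the quantization levels in [lo, hi].

    A singly quantized signal populates every level in this central range;
    double quantization with a larger first step leaves periodic gaps.
    """
    levels = np.round(values / q).astype(int)
    levels = levels[(levels >= lo) & (levels <= hi)]
    hist = np.bincount(levels - lo, minlength=hi - lo + 1)
    return int(np.sum(hist == 0))

rng = np.random.default_rng(0)
# Synthetic AC coefficients (Laplacian-like, as in real DCT blocks).
coeffs = rng.laplace(scale=20.0, size=100_000)

single = quantize(coeffs, 5)               # compressed once ("fake" region)
double = quantize(quantize(coeffs, 8), 5)  # compressed twice ("real" region)

print("empty bins, single compression:", empty_bins(single, 5))
print("empty bins, double compression:", empty_bins(double, 5))
```

In this setup the singly compressed signal fills all central histogram bins, while the doubly compressed one shows many empty bins; comparing such statistics per region is one classical way to localize spliced content.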