Deep Fake Video Detection Using Transfer Learning Approach

Bibliographic Details
Published in: Arabian Journal for Science and Engineering, Vol. 48, No. 8, pp. 9727-9737
Main Authors: Suratkar, Shraddha; Kazi, Faruk
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.08.2023 (Springer Nature B.V.)
ISSN: 2193-567X, 1319-8025, 2191-4281
DOI: 10.1007/s13369-022-07321-3

More Information
Summary: The use of the internet as a fast medium for spreading fake news reinforces the need for computational tools to combat it. Fake videos, also called deepfakes, pose a serious threat to society across a range of social and political contexts and can be exploited for malicious purposes. Owing to the availability of deepfake generation algorithms and cheap computational power on cloud platforms, highly realistic fake videos and images can now be created. At the same time, detecting fake content has become harder because of the increasingly sophisticated techniques used to conceal tampering. Therefore, this work proposes a novel framework to detect fake videos through the use of transfer learning in autoencoders and a hybrid model of convolutional neural networks (CNNs) and recurrent neural networks (RNNs). Unseen test data are used to assess the generalizability of the model, and the effect of residual image input on model accuracy is analyzed. Results are reported both with and without transfer learning to validate the effectiveness of transfer learning.
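
The abstract does not give implementation details, so the following is only a minimal sketch of the kind of hybrid CNN-RNN detector with transfer learning it describes, assuming a PyTorch setup with an ImageNet-pretrained ResNet-18 as a frozen per-frame encoder and an LSTM over the frame features; the backbone, layer sizes, and framework are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

class CnnRnnDeepfakeDetector(nn.Module):
    """Per-frame features from a frozen pretrained CNN, aggregated by an LSTM."""

    def __init__(self, hidden_size=256, num_layers=1):
        super().__init__()
        # Transfer learning: reuse an ImageNet-pretrained ResNet-18 as the frame encoder.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        feature_dim = backbone.fc.in_features        # 512 for ResNet-18
        backbone.fc = nn.Identity()                  # drop the ImageNet classification head
        for p in backbone.parameters():              # freeze the pretrained weights
            p.requires_grad = False
        self.encoder = backbone
        self.rnn = nn.LSTM(feature_dim, hidden_size, num_layers, batch_first=True)
        self.classifier = nn.Linear(hidden_size, 1)  # single logit: fake vs. real

    def forward(self, clips):
        # clips: (batch, frames, channels, height, width)
        b, t, c, h, w = clips.shape
        feats = self.encoder(clips.view(b * t, c, h, w))  # encode every frame
        feats = feats.view(b, t, -1)                      # regroup into per-clip sequences
        _, (h_n, _) = self.rnn(feats)                     # temporal aggregation
        return self.classifier(h_n[-1])                   # (batch, 1) logits

# Usage: score two clips of 16 frames each (224x224 RGB).
model = CnnRnnDeepfakeDetector().eval()
with torch.no_grad():
    logits = model(torch.randn(2, 16, 3, 224, 224))
print(torch.sigmoid(logits))  # per-clip probability of being fake
```

Freezing the pretrained backbone and training only the recurrent layer and classifier is one common way transfer learning is applied in this setting; fine-tuning the upper CNN layers on a deepfake dataset is another.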