Deepfake Labels Restore Reality, Especially for Those Who Dislike the Speaker
Authors: Nathan L. Tenhundfeld, Ryan Weber, William I. MacKenzie, Hannah M. Barr, Candice Lanius
Published: 2024-03-29 17:11:13+00:00
AI Summary
This study investigated whether labeling deepfake videos of President Biden as such improved participants' later ability to distinguish them from real videos. Participants accurately recalled 93.8% of deepfake videos and 84.2% of actual videos, suggesting that labeling videos can aid in combating misinformation.
Abstract
Deepfake videos create dangerous possibilities for public misinformation. In this experiment (N=204), we investigated whether labeling videos as containing actual or deepfake statements from US President Biden helps participants later differentiate between true and fake information. Participants accurately recalled 93.8% of deepfake videos and 84.2% of actual videos, suggesting that labeling videos can help combat misinformation. Individuals who identified as Republican and had lower favorability ratings of Biden performed better at distinguishing between actual and deepfake videos, a result explained by the elaboration likelihood model (ELM), which predicts that people who distrust a message source will evaluate the message more critically.