November 10, 2021

Bringing the dark side of deepfakes to light

In recent years, deepfake technology has made enormous progress: in just a few years we went from simple Snapchat face swap filters to entire (open-source) software packages that allow real-time deepfakes. We covered real-time deepfakes in our previous blog.


What most people don't know is that the majority of deepfake videos are not the funny face swaps found on YouTube. Most deepfake content on the open web is sexually explicit (also referred to as adult content). In these non-consensual videos, the face of the actual actress has been replaced with the face of a female celebrity. In fact, over 90% (!) of the deepfakes on the internet are sexually explicit in nature and can be found in abundance on adult content websites.

In 2019, the likeness of a Dutch news anchor was deepfaked into an adult content video. According to this article, taking the video down was only possible “with the greatest possible effort”. The list of female celebrities that have been deepfaked into adult content videos is practically endless. With the accessibility and democratization of deepfake generation tools, we expect the number of deepfake adult content videos to keep increasing. That same accessibility also makes it easier to blackmail someone (for example, an ex-girlfriend) with a deepfake adult content video. This is a very worrying development with potentially far-reaching consequences. Apart from being illegal, such a video is humiliating for the victim, and in countries where viewing and distributing adult content is illegal, it may even endanger the victim's life.


The legality of showing these videos on adult content platforms is debatable. Using someone's likeness without permission is illegal almost everywhere. In The Netherlands, for instance, making and spreading revenge porn is illegal. However, most adult content providers position themselves as search engines and therefore do not own the content. Consequently, such platforms cannot be held liable for non-consensual fake content they make available to a wide range of internet users. With hardly any legal precedent, no one knows the ins and outs of the matter. Luckily, this blog isn't about the legal aspect of deepfake adult content. 


On finding these deepfakes

This blog describes how we analyzed over 170,000 adult content video thumbnails and what we found. The project was carried out on behalf of a platform that used the analysis results to take deepfake content offline, allowing the platform to avoid reputation damage and possible legal issues. The aim was to quantify how many deepfake adult content videos are in the celebrity category, an obvious choice since almost all adult content deepfakes feature a female celebrity.

We received a large CSV file from our partner in the adult content industry with over 170,000 URLs linking to the thumbnail of a video. These were single image thumbnails in JPG format. We downloaded all the thumbnails we could (some links were broken) to an external hard disk, and from there we could start our analysis. Our deepfake check pipeline consisted of the following three steps (a short code sketch follows the list):

  1. Extract face(s) from the thumbnail. For this, we used open-source face detection algorithms. Most of the thumbnails actually did not contain a face (or the face detection algorithms did not find it). Thumbnails that did not contain a face were discarded and not analyzed further.
  2. Discard faces that are smaller than 128 x 128 px. A sensitivity analysis of our deepfake detection software has shown that this is about the minimum resolution for reliable results.
  3. Check if the extracted face is a deepfake with our proprietary software. More information on our product can be found here.
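
To make these steps concrete, here is a minimal Python sketch of the pipeline. It is illustrative only: it assumes the thumbnails have already been downloaded to local files, it uses the open-source face_recognition library for step 1 (any face detector would do), and is_deepfake is a hypothetical placeholder for our proprietary detector.

    # Minimal sketch of the three-step thumbnail pipeline (illustrative only).
    # Step 1 relies on the open-source face_recognition library; step 3 is a
    # placeholder, since the actual classifier is proprietary.
    import face_recognition

    MIN_FACE_SIZE = 128  # minimum face resolution (px) for a reliable verdict

    def is_large_enough(box):
        # Step 2: keep only faces of at least 128 x 128 px.
        top, right, bottom, left = box
        return (bottom - top) >= MIN_FACE_SIZE and (right - left) >= MIN_FACE_SIZE

    def is_deepfake(face_crop):
        # Step 3: hypothetical placeholder for the proprietary deepfake detector.
        raise NotImplementedError("plug in your own deepfake classifier here")

    def analyse_thumbnail(image_path):
        # Step 1: load the thumbnail and detect faces (may find none).
        image = face_recognition.load_image_file(image_path)
        boxes = face_recognition.face_locations(image)
        verdicts = []
        for top, right, bottom, left in boxes:
            if not is_large_enough((top, right, bottom, left)):
                continue  # face too small for a reliable result (step 2)
            face_crop = image[top:bottom, left:right]
            verdicts.append(is_deepfake(face_crop))  # step 3
        return verdicts  # an empty list means the thumbnail was discarded

Thumbnails for which analyse_thumbnail returns an empty list are the ones discarded in steps 1 and 2; the rest get a deepfake verdict for each detected face.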

The beauty of this approach is that it does not require any manual checks: every step is performed automatically in an end-to-end pipeline, so there was no need to look at even a single thumbnail. Due to the sensitive nature of the data, we had agreed to do the analysis this way.


After steps 1 and 2, approximately 14% of the thumbnails remained. That may not sound like much, but it still amounts to roughly 25,000 thumbnails. These 25,000 thumbnails were fed into step 3, and what we found surprised us.


In these 25,000 thumbnails, we found just 72 deepfakes. We had expected a higher number, but 72 out of 25,000 (roughly 0.3%) is the ratio we found. Assuming we can extrapolate this ratio one-to-one to the entire dataset, we expect around 490 deepfake videos in the celebrity category. That may be fewer than anticipated, but we believe it is still 490 videos too many.
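
For reference, the back-of-the-envelope extrapolation behind that estimate is straightforward (assuming the roughly 25,000 analyzed thumbnails are representative of the full set of about 170,000):

    # Extrapolating the 72-in-25,000 finding to the full dataset
    # (assumes the analyzed subset is representative of the whole category).
    deepfakes_found = 72
    thumbnails_checked = 25_000
    total_thumbnails = 170_000

    ratio = deepfakes_found / thumbnails_checked   # ~0.29%
    estimate = ratio * total_thumbnails            # ~490 suspected deepfake videos
    print(f"{ratio:.2%} of checked thumbnails -> ~{round(estimate)} expected overall")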


What did we do with these findings?

The current approach of adult content websites is to take videos offline only when complaints are received or a takedown is requested. This is a reactive approach, where the damage is already done by the time the video is taken offline. We all know that once a video is on the web, it never goes away completely.


In the case of deepfake material, a proactive approach, such as pre-checking whether the material was recorded consensually, is both better and very doable. It significantly limits the damage suffered by the victim and saves adult content websites the effort of removing content afterwards. Unfortunately, we haven't seen adult content websites actively doing this yet.


These results were shared with our partner, and they took down the videos that we suspected were deepfakes.


Without clear legislation for adult content providers, we expect that the number of deepfake adult content videos on the internet will continue to increase. We hope this blog post raises awareness about this issue and we see it as a small step towards a world where women don't have to worry about their image ending up in a deepfake adult content video.


About DuckDuckGoose

DuckDuckGoose is a Delft-based startup making software for deepfake detection. With our software, we aim to realize a digital environment where we can still believe what we perceive. Want to know more? Click here to contact us!

Questions? Let's talk

Schedule a call with us. We're always happy to help!