Deepfakes have doubled, overwhelmingly targeting women

OK, let's pull back from the nail-biting, perhaps hyperbolic, definitely hyperventilating, supposed threats to politicians and focus on who's really being victimized.

Unsurprisingly enough, according to a new report, that would be women.

96% of the deepfake videos created in the first half of the year were pornographic, mostly nonconsensual, and mostly casting celebrities who never gave their permission, let alone received compensation.

The report, titled The State of Deepfakes, was issued last month by Deeptrace, an Amsterdam-based company that uses deep learning and computer vision to detect and monitor deepfakes, and whose stated mission is “to protect individuals and organizations from the damaging impacts of AI-generated synthetic media.”

According to Deeptrace, the number of deepfake videos almost doubled over the seven months leading up to July 2019, reaching 14,678. The growth has been fueled by the increasing commodification of tools and services that let non-experts churn out deepfakes.

One recent example was DeepNude, an app built on a family of dueling computer programs known as generative adversarial networks (GANs): machine learning systems that pit neural networks against each other in order to generate convincing images, such as photos of people who don't exist. DeepNude not only advanced the technology; it also packaged it into an app that anybody could use to strip off (mostly women's) clothes and generate a deepfake nude within 30 seconds.
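For readers wondering what “pitting neural networks against each other” actually looks like, here's a minimal, illustrative GAN training loop in PyTorch. To be clear, this is a generic sketch of the adversarial setup, not DeepNude's actual code; the toy layer sizes and the random “real” batch are stand-ins for a real image dataset and architecture:

```
# A minimal, illustrative GAN training loop (PyTorch). Generic sketch of
# the adversarial idea described above -- NOT DeepNude's code.
# The generator learns to turn random noise into samples that fool the
# discriminator; the discriminator learns to tell real samples from fakes.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64  # arbitrary toy sizes for illustration

generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(32, data_dim)  # stand-in for a batch of real images

    # 1. Train the discriminator: push real samples toward label 1,
    #    generated samples toward label 0.
    noise = torch.randn(32, latent_dim)
    fake = generator(noise).detach()  # detach: don't update G on this pass
    d_loss = (loss_fn(discriminator(real), torch.ones(32, 1)) +
              loss_fn(discriminator(fake), torch.zeros(32, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Train the generator: try to make the discriminator call its
    #    output "real" (label 1).
    noise = torch.randn(32, latent_dim)
    g_loss = loss_fn(discriminator(generator(noise)), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

The tug-of-war between the two losses is the whole trick: as the discriminator gets better at spotting fakes, the generator is forced to produce ever more convincing ones.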

We saw another faceswapping app, Zao, rocket to the top of China's app stores last month, sparking a privacy backlash and just as quickly getting itself banned from China's top messaging app service, WeChat.

While Deeptrace says most deepfakes come from English-speaking countries, it adds that, unsurprisingly, it's seeing “a significant contribution to the creation and use of synthetic media tools” from web users in China and South Korea.

Deeptrace says that non-consensual deepfake pornography accounted for 96% of the total number of deepfake videos online. Since February 2018, when the first deepfake porn site was registered, the top four deepfake porn sites have received more than 134 million views on videos targeting hundreds of female celebrities worldwide, the firm said. That illustrates what will surprise approximately 0% of people: deepfake porn has a healthy market.

History lesson

As Deeptrace tells it, the term ‘deepfake’ was coined by the Reddit user u/deepfakes, who created a Reddit forum of the same name on 2 November 2017. The forum was dedicated to creating and using deep learning software to synthetically faceswap female celebrities into pornographic videos.

Reddit banned /r/Deepfakes in February 2018 – as did Pornhub and Twitter, which banned deepfake porn on their platforms – but the faceswap source code had already been released to the open-source community and uploaded to GitHub, where it seeded multiple project forks, with programmers continually improving the quality, efficiency, and usability of the new code libraries.

Since then, we've seen faceswapping apps as well as one app for synthetic voice cloning (and one business get scammed by a deepfake CEO voice that talked an underling into a fraudulent $243,000 transfer).

Most of the apps require programming ability, plus a powerful graphics processor, to operate effectively. Even here, though, the technology is growing more accessible: step-by-step tutorials now exist for the most popular deepfake apps, and recent updates have made several of their GUIs easier to use.

Deeptrace says there are now also service portals for generating and selling custom deepfakes. In most cases, customers have to upload photos or videos of their chosen subjects for deepfake generation. One service portal Deeptrace identified required 250 photos of the target subject and two days of processing to generate the deepfake. The prices of the services vary, depending on the quality and duration of the video requested, but can cost as little as $2.99 per deepfake video generated, Deeptrace says.

The DeepNude app was pushed offline, but it has since become a case study in deepfake service portals. In spite of the authors saying that they'd “greatly underestimated the volume of download requests” and protesting that “the world is not ready for DeepNude,” the world showed that it was, in fact, more than ready.

The open-source code was subsequently cracked, independently repackaged and distributed through various online channels, such as open-source repositories and torrenting websites, and has spawned two new service portals offering allegedly improved versions of the original DeepNude. Prices range from $1 per photo to $20 for a month of unlimited access.

Oh, I guess the world is ready for DeepNudes, said the original creators, who were also ready to line their pockets, given that they put DeepNude up for sale on 19 July 2019 for $30,000 via an online business marketplace, where it sold to an anonymous buyer.

Well, yikes, Deeptrace said. That was a disaster in the making – at least for women, if not for the DeepNude creators, who walked away $30,000 richer:

The moment DeepNude was made available to download it was out of the creators' control, and is now highly difficult to remove from circulation. The software will likely continue to spread and mutate like a virus, making a popular tool for creating non-consensual deepfake pornography of women easily accessible and difficult to counter.

Verified deepfakes include an art project that turned Facebook CEO Mark Zuckerberg into Mark Zucker-borg: the CEO's evil deepfake twin who implied that he's in total control of billions of people's stolen data and ready to control the future.

We've also seen deepfakes used to enhance fake digital identities for fraud, infiltration and espionage.

Besides the voice deepfake, there have been LinkedIn deepfake personas: one such was “Katie Jones”, an extremely well-connected redhead and purportedly a Russia and Eurasia Fellow at the prominent think tank Center for Strategic and International Studies (CSIS), who was eager to add you to her professional network of people to spy on.
