A bot operating on the social media platform Telegram has generated nude images of more than 100,000 real women, according to a new report.
The Amsterdam-based intelligence company Sensity uncovered the free-to-use bot, which has been operating since July 2019.
Using artificial intelligence, the bot allows users to upload a picture of a clothed woman and, after a short wait, returns a manipulated version of the image, made to look as if the subject has been stripped naked.
The report said these "deepfake" images can be downloaded and therefore shared in private or public channels outside of Telegram, meaning they can be used to extort or publicly shame their target.
The bot only works on women.
Deepfake technology is not new. But Sensity’s chief executive Giorgio Patrini said the Telegram bot is significant because it’s so easy to use.
"All users need to do here is find these groups, know the keywords, then simply upload one single picture to have the bot do its job," he said.
Telegram did not immediately reply to The Telegraph’s request for comment and the bot is still available on the site.
"Telegram’s Terms of Service fails to mention what kind of content the bots available on Telegram are allowed to upload or prohibited from distributing," said Ksenia Bakina, Legal Officer, Privacy International. "This suggests that the deepfake nudes bot isn’t breaking Telegram’s Terms of Service and that’s part of the reason why Telegram has failed to delete it."
Experts believe the technology embedded in the bot can be traced back to the DeepNude app, which also used AI to undress images of women.
After public outrage, the creator took the app offline in June 2019, saying in a Twitter statement: "The world is not yet ready for DeepNude".
However, Nina Schick, author of Deep Fakes and the Infocalypse, said that last year the developers quietly sold the machine learning system in an anonymous auction for $30,000.
"It’s obvious then that know-how has now leaked and been repurposed," she said.
She added that the bot is so harmful because it can be used against any woman, including minors.
"Any woman can be a target because who doesn’t have a digital footprint online either through social media, their friends’ or family’s social media, or through work," she said.
The vast majority of the bot’s users are from Russia and former USSR countries, with 3pc from English-speaking countries such as the UK and the US.
Even though the naked images created by the bot are often obviously manipulated, this hasn’t affected the demand, said Sensity’s Patrini.
"From the point of view of the victims, that doesn’t really matter," he said. "Seeing a photo of yourself naked in the situation where you didn’t take photo is quite threatening and quite shocking for the victim in spite of the quality."
Although there has been much discussion about the potential effect of deepfakes on politics, when Sensity (formerly known as DeepTrace Labs) surveyed 14,000 deepfake videos online last year, the company found 96pc featured pornography, all of it targeting women.