680,000 people ‘stripped’ nude by Telegram bot
Visual threat intelligence company Sensity has reported a deepfake ecosystem on the Telegram messaging app, based around the rapid generation and distribution of AI-synthesised nude images. At least 680,000 victims – including minors – have been targeted.
The nude images are generated using an AI-powered bot. The bot is likely based on an open-source version of the DeepNude software, which uses deep learning – specifically generative adversarial networks (GANs) – to generate synthetic nude versions of photographs of clothed women.
A basic version of DeepNude was launched in June 2019 and met with both enthusiasm and condemnation. Less than a month later, the DeepNude licence was sold to an anonymous buyer for $30,000. The software has since been reverse-engineered and can be found in “enhanced forms” on open-source repositories and torrenting websites.
Running DeepNude would normally require users to have access to a computer with a GPU, but the bot is powered by external servers. The bot also provides a simple interface – with the appearance of a standard instant messaging app – which allows anyone to upload a photograph of a target and receive a “stripped image” after a quick generation process. These changes significantly lower the barrier to access, allowing people with no expertise to generate photorealistic nude images using just a smartphone.
While the bot is free to use, users can pay for 'premium coins' (starting price RUB100 for 12 coins) which remove watermarks and allow them to skip the processing queue.
At present, the bot can only generate stripped images of feminine bodies.
Sensity found that once an attacker has a stripped image of their target, they may download it, share it on social media, forward it to the target’s friends and family, or forward it to the target for the purposes of blackmail. They may also share the personal information of the target on underground marketplaces.
Self-reporting from bot users indicated that 70 per cent of targets are private individuals whose photos are taken from social media accounts or private communications. While celebrities are the primary targets of pornographic deepfake videos, they accounted for just 16 per cent of those targeted by the bot. Sensity's own analysis of the images posted to affiliated image-sharing channels confirmed these findings: most appeared to have been taken from social media.
A limited number of stripped images appeared to feature minors.
The bot and its seven affiliated Telegram channels have attracted over 100,000 members worldwide, around 70 per cent of whom are from Russia and other former USSR countries. The bot also has a significant presence on VK, Russia's largest social media platform.
Sensity estimated that at least 104,852 women had had 'stripped' images of them shared publicly by the end of July 2020, with the volume of shared images growing by 198 per cent in the three months to July. However, one day before the publication of the report, Sensity discovered a new page advertising the bot which indicated that more than 680,000 people had been targeted.