A DEEPFAKE website that uses AI to create fake “nude” photos and revenge pornography is reportedly spreading its abusive content across the web.
According to a WIRED investigation, the site is expanding with new services and recruiting new users through a referral system.
WIRED did not name the site but said it was growing despite a ban on it accepting payment for its ‘services’.
The site uses artificial intelligence to create realistic fake “nude” images of women.
All the AI needs is an image of a fully clothed woman, and it can generate a fake nude photo.
The site is said to be raking in thousands of dollars for its creators, and it is said to be expanding by allowing “partners” to use its algorithms.
At least two sub-sites have been created.
WIRED says the site has a “partner program” page, which explains plans to offer customers alternative payment methods and versions of its content in different languages.
It also says having a decentralized model will help avoid “sudden service outages, even terminations.”
This approach appears to have helped the site avoid being taken down.
According to WIRED, hundreds of thousands of images were at one point uploaded to the site in a single day.
Following media attention earlier this year, the site has faced restrictions and Coinbase appears to have suspended its payments account.
Visits to the site have dropped significantly, but there are concerns that its new partners are helping to keep its abusive technology accessible.
The creators of the sub-sites are reportedly paying $500 to the original site for every 10,000 nude images they create with the software.
Popular social media platforms are also credited with helping the site grow again as links to “free image generator tokens” have been discovered on Twitter and YouTube.
As the site continues to grow, its victims are increasingly concerned.
Many US states have laws against revenge porn, and so does the UK.
The devastating impact that revenge porn can have on victims is well known.
However, US and UK revenge porn laws currently do not cover deepfakes.
In other news, Facebook Messenger and Instagram users can now play Heads Up!
Snapchat gave away a whopping $250m (£189m) to its creators in the last year.
And, Apple will scan iPhone messages for nudity in an effort to crack down on child abuse.