US President Joe Biden’s AI executive order recommends, but does not require, companies to label AI-generated photos, videos and audio to indicate computer-generated work
Amid the controversy over a morphed video of actor Rashmika Mandanna that reportedly went viral on social media, a new report claims that artificial intelligence (AI) is fuelling an unprecedented boom in fake pornographic images and videos this year.
Easy-to-use AI tools can “undress” people in photographs or seamlessly swap a face into a pornographic video, reports The Washington Post.
Across the top 10 websites that host AI-generated porn photos, fake nudes have surged by more than 290 per cent since 2018, according to industry analyst Genevieve Oh.
AI-generated porn videos have also exploded across the web.
After scouring the 40 most popular websites for faked videos, Oh found that more than 143,000 videos had been added in 2023. These sites feature celebrities and political figures, such as New York Representative Alexandria Ocasio-Cortez, alongside teenage girls.
“There’s no federal law governing deepfake porn, and only a handful of states have enacted regulations,” the report mentioned.
Last week, reports surfaced that actor Scarlett Johansson was suing an app developer for using her name and an AI-generated version of her voice in an online ad.
As reported by Variety, the 22-second ad showed Johansson behind the scenes while filming ‘Black Widow’, where she says, “What’s up guys? It’s Scarlett and I want you to come with me.” Later, an AI-generated voice meant to sound like the actor takes over, saying, “It’s not limited to avatars only. You can also create images with texts and even your AI videos. I think you shouldn’t miss it.”