After several sexually explicit AI-generated images featuring Taylor Swift’s face went viral online, the White House has weighed in.
X/Twitter spent most of last week hiding searches for Taylor Swift as it worked to combat the viral spread of the images. The White House Press Secretary spoke about the issue on Friday, telling ABC News that what can be accomplished with AI tech is ‘alarming.’
“We are alarmed by the reports of the circulation of images that you just laid out—false images to be more exact—and it is alarming,” Karine Jean-Pierre told ABC News. “While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people,” she continued.
Social media sites do have content moderation policies in place, and X/Twitter’s Safety account reiterated its rules against non-consensual nudity, though without naming Taylor Swift.
Many Swift fans posted flattering images and messages of admiration for the singer in an effort to drown out reposts of the AI-generated deepfake nudes. The incident also has Congress discussing a bill that would make non-consensual sharing of digitally altered explicit images a federal crime.
The bipartisan “Preventing Deepfakes of Intimate Images Act” was introduced by Rep. Joe Morelle (D-NY) in 2023 and has been referred to the House Committee on the Judiciary. “We’re certainly hopeful the Taylor Swift news will help spark momentum and grow support for our bill, which as you know, would address her exact situation with both criminal and civil penalties,” a Morelle spokesperson told ABC.
The proliferation of AI image-generation tools has made it easier than ever to create deepfake mash-ups of whatever concept you want. The creators of the Taylor Swift images used Microsoft Designer, a tool intended to make social media posts easier to create.