Taylor Swift AI Images Prompt Statements From White House, SAG-AFTRA

After explicit AI-generated images of Taylor Swift were shared on X last week, several organizations called for legislation addressing AI, an issue that, given the rapid spread of generative AI over the past year, arguably should have been on the table before it affected the most famous woman in the world.

What happened? On Wednesday, sexually explicit deepfake images of Swift began circulating on X, amassing tens of millions of views over roughly 17 hours before the account that posted them was suspended. Swift's fans then mass-reported the account and flooded X with their own photos in an effort to bury the deepfakes in search results, pushing PROTECT TAYLOR SWIFT to trend on the platform.

X has a policy banning the sharing of "synthetic, manipulated, or out-of-context media that may deceive or confuse people and lead to harm," yet the images of Swift remained on the platform for nearly a full day. By Saturday, Swift's name was no longer searchable on X, though workarounds such as the query "Taylor AND Swift" still returned results. "This is a temporary action and done with an abundance of caution as we prioritize safety on this issue," X said in a statement on Saturday.

Though X and other platforms are attempting to scrub the images from their servers, that doesn't mean they're gone forever; the internet being the internet, new copies were circulating even as the originals were taken down.

How did this happen? A report by 404 Media found that the deepfakes appeared to originate from a Telegram group dedicated to making AI-generated, non-consensual explicit images of women, abhorrent in and of itself. The group reportedly used Microsoft's Designer to create the images; although such content violates Microsoft's policies, the tool was still able to generate it.

Microsoft CEO Satya Nadella said "we have to act" in an interview with NBC News after the deepfakes of Swift went viral, adding, "I think we all benefit when the online world is a safe world. And so I don’t think anyone would want an online world that is completely not safe for both content creators and content consumers. So therefore, I think it behooves us to move fast on this."

What they're saying: SAG-AFTRA released a statement on Friday, calling the images "upsetting, harmful and deeply concerning."

“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the union said. “As a society, we have it in our power to control these technologies, but we must act now before it is too late.”

At a White House press briefing on Friday, press secretary Karine Jean-Pierre was asked whether President Biden supports legislation that would make AI-generated pornographic images illegal.

“It is alarming,” Jean-Pierre said. “We are alarmed by the reports of the circulation of images that you just laid out… There should be legislation, obviously, to deal with this issue.”
