X has shut down searches for Taylor Swift as it works to remove all AI-generated nude images of her from the platform.
Searches for “Taylor Swift” or “Taylor Swift AI” currently return a message that says “Something went wrong. Try reloading.” Fans can still search for “Taylor Swift images,” but the results only include innocent photos of the singer.
The shutdown comes after Swift deepfakes hit X on January 24. The explicit AI-generated photos first appeared on Celeb Jihad and quickly spread to other social media sites. Taylor Swift is not the first celebrity to fall victim to explicit deepfakes, which have drawn concern from SAG-AFTRA and the White House.
“We are alarmed by reports of the circulation of the images that you just laid out — fake images, to be exact — and it is alarming,” White House press secretary Karine Jean-Pierre said during a press conference.
She continued: “While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual intimate images of real people.”
SAG-AFTRA urges lawmakers to prevent Taylor Swift and others from falling victim to more deepfakes
According to Deadline, people viewed Taylor Swift’s X deepfakes more than 27 million times, and the posts received more than 260,000 likes. The platform suspended the account that published them within 19 hours.
Once the news hit, Swift’s fans began flooding social media with “Taylor Swift AI” attached to positive images and messages in an attempt to bury the nude images. Fans also pushed the hashtag “Protect Taylor Swift” to trending.
“The development and dissemination of false images, especially those of a lewd nature, without someone’s consent should be considered illegal,” SAG-AFTRA wrote in a statement. “As a society, we have it in our power to control these technologies, but we must act now before it is too late.”
“SAG-AFTRA continues to support Congressman Joe Morelle’s legislation, the Preventing Deepfakes of Intimate Images Act, to ensure we prevent exploitation of this nature from happening again. We stand with Taylor and women around the world who are victims of this type of theft of their privacy and right to autonomy.”