Rep. Yvette Clarke (D-N.Y.) on Thursday blasted the online circulation of fake, sexually explicit images of Taylor Swift that were likely generated by artificial intelligence, calling on lawmakers from both parties to find a solution to an issue that affects women across the country.
“What’s happened to Taylor Swift is nothing new,” Clarke wrote on social media, as she requested action from politicians as well as the singer’s fans. “This is an issue both sides of the aisle & even Swifties should be able to come together to solve.”
The Democrat noted that advances in technologies such as AI have made it easier for bad actors to create seemingly realistic fake images, sometimes known as deepfakes.
One post with fake Swift images on X, the social platform formerly known as Twitter, garnered over 45 million views before it was removed about 17 hours later, The Verge reported, even though the company has rules banning this type of content.
X issued a statement early Friday saying that it was monitoring its site for further violations, without directly mentioning Swift.
“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” the company wrote on its @Safety account. “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”