The popular social network X, formerly known as Twitter and owned by Elon Musk, has taken a significant step to combat the spread of fake explicit images of Taylor Swift. Users searching for “Taylor Swift” on X now encounter an error message that reads, “Something went wrong. Try reloading.” According to X’s head of business operations, Joe Benarroch, the block is a temporary measure to prioritize user safety.

The decision to block Taylor Swift-related searches comes in response to the recent viral circulation of sexually explicit AI-generated images of the artist on X and other internet platforms. The situation prompted SAG-AFTRA, a union representing performers, to issue a statement condemning the images as “upsetting, harmful, and deeply concerning.” The union also called for the creation of legislation that would make the development and dissemination of such fake images illegal.

This issue even caught the attention of the White House. When asked if President Biden would support legislation against AI-generated porn, White House press secretary Karine Jean-Pierre responded, “We are alarmed by the reports of the circulation of images that you just laid out… There should be legislation, obviously, to deal with this issue.”

In response to the controversy, X’s Safety team released a statement emphasizing its commitment to removing nonconsensual nudity from the platform. The team said that posting such images is strictly prohibited and subject to a zero-tolerance policy, and assured users that it is actively removing identified images and taking appropriate action against the accounts responsible for posting them.

F5 Magazine supports X’s efforts to protect user safety and privacy. We believe in the importance of maintaining a safe and respectful online environment for all users. To read more about this news, please visit F5mag.com.

Source: Variety

By f5mag
