X blocks searches for Taylor Swift following the circulation of explicit AI images
Social media platform X has blocked searches for Taylor Swift after explicit AI-generated images of the singer began circulating on the site. X's head of business operations, Joe Benarroch, described the block as a temporary measure to prioritize safety. Users who search for Swift on the platform now receive an error message.

The graphic fake images went viral and were viewed millions of times, drawing concern from US officials and fans. Swift's supporters flagged the posts and accounts sharing the images and flooded the platform with real photos and videos of her under the hashtag "protect Taylor Swift".

X, formerly known as Twitter, said in a statement that posting non-consensual nudity is strictly prohibited on the platform and that it is actively removing all identified images and taking appropriate action against the accounts responsible. It is unclear when X began blocking searches for Swift, or whether it has done so for other public figures in the past.

The White House expressed alarm over the spread of the AI-generated photos and called for legislation to address the misuse of AI technology on social media. US politicians have also called for new laws to criminalize the creation of deepfake images. In the UK, sharing deepfake pornography became illegal under the Online Safety Act in 2023.