MOSCOW, January 26. American singer Taylor Swift intends to sue the creators of deepfake pornography featuring her likeness, the Daily Mail reports, citing its sources.
The publication claims the star is "livid" over images created by artificial intelligence and posted on the deepfake adult site Celeb Jihad.
"Whether legal action will be taken is being decided, but one thing is clear: these fake AI-generated images are abusive, offensive, exploitative, and done without Taylor's consent. They should be removed wherever they exist, and no one should promote them," a source close to Swift said.
The images depicted the singer at a game of the Kansas City Chiefs, the football team for which her boyfriend Travis Kelce plays.
The images went viral, spreading across the social networks X (formerly Twitter — ed.), Facebook*, Instagram*, and Reddit. One post on X gathered more than 45 million views, 24 thousand reposts, and hundreds of thousands of likes within seventeen hours.
X later issued an apology and assured users that it was keeping the spread of the deepfakes under control.
Lawyers and some legal experts in the United States are calling for federal regulation of the issue. They believe that victims should have uniform protection throughout the country and that perpetrators should be held accountable.
* The activities of Meta (social networks Facebook and Instagram) are prohibited in Russia as extremist.