The recurring story of new technologies is one of unintended consequences. Take AI-powered image generators as an example. Their makers claim they expand the human imagination and turn everyone into an artist, but they rarely mention how much these tools also contribute to the creation of fake, non-consensual pornography. A lot.
Over the weekend, X had to block searches for "Taylor Swift" because the site formerly known as Twitter had been flooded with so many fake pornographic images of the singer that its moderators couldn't keep up.
One image alone was viewed more than 45 million times before being taken down. Swift's ordeal points to a broader problem: around 96 per cent of deepfakes on the web are pornographic. But it could also be the tipping point that finally prompts some genuine solutions.
Taylor Swift is reportedly suing a deepfake site that posted some of them. PA
In January alone, enough has happened to show that, in the absence of proper regulation, the harms of generative AI are starting to outweigh the benefits. Generative AI is being used in more scams and bank fraud, is degrading Google search results, and has misled voters with fake robocalls impersonating US President Joe Biden.
But the attack on Swift shows where the poisonous effects of generative AI are at their most insidious, creating entirely new classes of victims and abusers in a marketplace of non-consensual sexualised images. It also points to subtler but no less harmful ways in which generative AI undermines women's dignity, for example by generating sexualised images by default, an outcome that is worst for women of colour.
Deepfakes were a problem long before Swift, but they went largely unnoticed. Over the past year, high-school students have used real photographs of their classmates to create deepfake pornography. In a small town in Spain, a group of boys used AI tools to digitally "undress" the social media photos of more than 20 girls aged between 11 and 17, before sharing them on WhatsApp and Telegram.
Fake porn has been possible for more than two decades thanks to software like Photoshop, but only now has it become quick and easy for anyone to produce, with apps that can swap one person's face onto another's body, for instance.
But people and governments are finally waking up now that the latest victim is Swift, Time magazine's Person of the Year, who reportedly helped boost US GDP by 0.5 per cent, went to war with streaming giants like Apple Inc and won, and boosted American football's female audience. "We are alarmed by reports of false images," White House press secretary Karine Jean-Pierre said on Friday. "We will do everything we can to solve this problem."
Lawmakers are furious, legions of Swift fans have pushed the phrase "Protect Taylor Swift" into X's trending topics, and some have taken justice into their own hands, doxxing a Twitter user who allegedly posted many of the illicit images. Swift herself is reportedly taking legal action against a deepfake site that posted some of them, according to the Daily Mail.
Swift isn’t one to do things in half measures, so we may see more than just a lawsuit. Perhaps she will put her weight behind some of the bills already making their way through Congress that tackle unauthorised deepfakes. One bill makes it illegal to create and share such images, while another proposes five years in prison for perpetrators, as well as legal recourse for victims.
It's easy to shrug and say the cat is out of the bag. The tools have proliferated, some are open source, and as social networks like Twitter have cut their moderation and trust-and-safety teams, it has become all too easy for such images to go viral.
When Microsoft CEO Satya Nadella was recently asked about the Swift deepfakes, he jumped straight into platitudes about "guardrails" rather than making explicit policy recommendations, possibly because his company sits at the centre of the burgeoning generative AI sector. At least some of the Swift images were created with a Microsoft image generator, according to 404 Media.
There is hope for a solution. Some of the measures before Congress are a start, and while longer-term regulations are still being drafted, the government can for now take charge of the situation by making an example of some of the worst perpetrators.
Deterrents can work, even for those who think they can hide behind anonymity online. A telling example is the hacktivist collective Anonymous, whose activity ground to a halt almost overnight after a handful of its best-known hackers were arrested and named about a decade ago. One Twitter user who admitted to posting some of the first photographs that went viral said, "Bro, what have I done... They could pass new legislation because of my post about Taylor Swift," before making their account private, according to Newsweek.
Long before Swift became a victim, many young women without the same kind of influence were experiencing the psychological misery of being targeted. They had to pick up the pieces after seeing their reputations humiliatingly tarnished online, and then deal with the long-term aftermath.
Non-consensual pornography and deepfakes are no joke: they are a form of virtual sexual violence that threatens to fuel a broader culture of online misogyny and abuse. Perhaps more than any other woman on earth, Swift has the clout to help put an end to it. Let's hope she makes the most of the opportunity.
Bloomberg Opinion