Fake explicit pictures of Taylor Swift cause concern over lack of AI regulation
MILWAUKEE (CBS 58) -- The White House is calling on Congress to take legislative action after fake explicit photos of singer Taylor Swift were created using artificial intelligence.
Artificial intelligence experts say legal protections against these images are few and far between.
If a picture is worth a thousand words, generative artificial intelligence only needs a few to spit out an image.
“At this point, fairly hard to avoid being impacted by AI," said Derek Riley, the computer science program director at Milwaukee School of Engineering. “Generative AI can really make major changes to what we perceive in the media, what we hear, what we read.”
The rapidly growing technology can create images that never happened.
“There are tools available online that really a novice user can use to generate fake images like that," said Riley.
It's as simple as finding a free AI generator and typing a few keywords into a prompt box. Riley says this type of technology can be used to create dangerous content.
“Generative AI can be used for generating deepfakes for politicians, for pornography, and all other sorts of malicious cases," said Riley.
A deepfake is a digitally altered version of a person's voice or image, created with the intent to deceive or cause harm.
“Gone are the days that we can trust that because we see an image, it happened," said Riley.
Swift is among the most recent victims. Explicit images created in the global superstar's likeness spread across social media platforms, including X.
“You don’t need a technical background to generate a really convincing deepfake," said Riley.
Riley says there are only a few states with regulations against AI -- Wisconsin is not one of them.
“Our government doesn’t move fast enough to keep up with the changes that are happening, so I think it’s relatively unlikely we’ll see effective regulations," he said.
While some people, including Swift, may be able to pursue legal action against deepfakes, most victims will have to keep waiting.
"Some of these deepfakes are going to cause anxiety for folks and other consequences," said Riley.