Artificially Exposed: The dangers of AI deepfake nudes

MILWAUKEE (CBS 58) -- There's no doubt artificial intelligence can be an incredibly powerful tool when used for the right reasons. It can also, however, pose serious dangers to privacy and public safety. In recent months, the threat of AI-generated pornography has gained widespread attention.

"Right now, it's actually pretty hard to tell whether or not something has been AI-generated or AI manipulated versus, like, it being, like, authentic," said Dr. Michael Zimmer, the director of Center for Data, Ethics and Society at Marquette University.

The FBI has explicitly warned the public about an epidemic of people grabbing a photo and digitally removing the clothing from a person's body.

Dr. Zimmer said the concept of falsifying images isn't new. 

"People would take photos or cut and paste, or use Photoshop, or use other tools, you know, to try to pretend or embarrass people, or whatever it might be, but now, it's just so easy, it's free, it's fast, you don't need to know anything," he said.

Anyone can become a victim in an instant, even celebrities. Earlier this year, deepfake pornographic images of Taylor Swift spread rapidly for a week and were viewed tens of millions of times before the platform X took them down.

"There's not much you can do to prevent someone from creating these kinds of images and sharing them online," added Zimmer. "The law is slowly catching up, I think, to the realities of these kinds of images; there are some laws out there about non-consensual, you know, sexual images, or personal images because often times you don't have a copyright claim, you may not even have a privacy claim 'cause if you're not a public figure, I can't claim that my reputation was hurt."

Zimmer also raised a secondary concern: where are these photos ultimately stored?

"Now those source photos become part of this database and it's another set of concerns about what kind of images are these companies collecting from users? And what's happening with those images?" he inquired.

There are many examples of everyday people, male and female, who have already been targeted, including minors. Zimmer said that such imagery of minors could even be considered child pornography.

This school year alone, cases in Washington state, New Jersey, California, and Florida made national headlines.

Experts say it's not just a modern form of bullying; in some states, including Wisconsin, it's a felony.

"I think it's a real problem, not only the sexualization but also the privacy," said Rep. Adam Neylon.

Neylon is one of 14 members of the Wisconsin Assembly AI Task Force. He told CBS 58 News that there needs to be a balance between innovation and public safety.

"I think there are limits to free speech, I think when it comes to misleading the public or creating images that are exploitative and especially when it comes to children," he said.

In fact, two teenage boys from Miami, Florida, appear to be the first people ever arrested and criminally charged for creating and sharing AI-generated nude images of classmates without their consent.

In late March, state lawmakers made it illegal to possess or share exploitative virtual imagery of children in Wisconsin. 2023 Wisconsin Act 224, titled 'Possession of Virtual Child Pornography,' carries a mandatory sentence of three years in prison, registration as a sex offender, and a $500 fine for each image associated with the crime.

Since last year, about two dozen states have introduced bills to address the problem. Experts say it will only become more difficult to tell what is real and what is not.

If you know of anyone participating in this kind of behavior, experts suggest immediately contacting law enforcement to report it.