
California Seeks to Block Deep-Fake Nudes

This article was originally published on Hot Air. You can read the original article HERE

The latest in a flurry of new legislation being signed into law in California has made it past Governor Gavin Newsom's desk prior to the end of this legislative session. At first glance, this new measure might seem harmless or even potentially helpful. Newsom once again has Artificial Intelligence in his cross-hairs, this time seeking to ban AI-generated nude images, particularly of subjects who appear to be minors. Banning the creation and dissemination of "harmful sexual imagery of children" is the stated purpose and few reasonable people would argue against it. However, like similar legislation that came before this, the law raises many questions and takes us into uncharted legal waters. (Associated Press)


California Gov. Gavin Newsom signed a pair of proposals Sunday aiming to help shield minors from the increasingly prevalent misuse of artificial intelligence tools to generate harmful sexual imagery of children.

The measures are part of California’s concerted efforts to ramp up regulations around the marquee industry that is increasingly affecting the daily lives of Americans but has had little to no oversight in the United States.

Earlier this month, Newsom also signed off on some of the toughest laws to tackle election deepfakes, though the laws are being challenged in court. California is widely seen as a potential leader in regulating the AI industry in the U.S.

Regulating the artificial intelligence industry is quickly turning out to be far easier said than done. As noted above, Newsom already signed off on a law that would ban election deepfakes, but that law quickly became tied up in knots in the courts. It might be easy to assume that banning naked images of children would be a far easier lift, and it truly should be. Nobody wants to see minors being sexually exploited or to have other children exposed to such imagery. But that's where the complications begin to pile up.

In order to prove a crime has taken place and prosecute someone for it, you first must establish a perpetrator, a victim, and the crime that is being alleged. If a deepfake nude of an individual who appears to be a minor shows up in someone's inbox, who was the child that was exploited in that fashion? Was the victim the actual child who was exposed to the image? And who was responsible for "creating" the image?


In traditional child pornography cases, most of those questions are easy enough to answer. The child who was photographed was being exploited, and the child who viewed the image was one of the harmed parties. The person who took the original photos and published them is responsible. With these AI images, we can at least identify a creator, because someone had to instruct the algorithm to generate an image matching particular specifications. The AI doesn't just sit around randomly generating racy images. But feeding instructions into an algorithm isn't the same as taking a picture.

Far more to the point, those are not images of actual children. There never was a child involved in the process. The bot is simply drawing on a vast library of similar images and stitching pieces together to create something entirely new and unique. The images represent a fictional being composed of ones and zeros. As for the actual children being exposed, they are looking at what is essentially a cartoon, albeit a shockingly realistic one. The courts have already begun to weigh in on the subject, determining that such fictional characters have no rights under the law and that the algorithm itself cannot be held legally liable for following the instructions it was given in the first place.

To be clear, I'm not making an argument here in favor of kiddie porn. I'm simply pointing out that we are venturing into issues that our legal system does not yet seem prepared to address. In order to prosecute kiddie porn, you need a kid and you need evidence that passes muster as pornography. Modern AI technology is leading us down previously uncharted paths at a breakneck pace. It may take the legal system a long time to catch up, assuming it can do so at all.


