
How To Protect the Property-Right in One’s Own Voice

This article was originally published by the NY Sun (Opinion).

Actress Scarlett Johansson isn’t the only person whose voice has been used without permission. There have been deepfakes of businesspeople authorizing fraudulent payments; union leaders urging members to vote against contracts they had just negotiated; and even the President of the United States making calls about policy proposals contrary to his views.

AI-driven technologies are upending entire industries and threatening jobs. And because we are still in the early stages of development and application, the landscape looks like the Wild West. Companies are putting down stakes, often ignoring laws and conventions, to survive the inevitable shoot-outs.  The entertainment industry in particular already looks like the OK Corral. 

Performers who earn their living with their voices and likenesses are among the first to experience the impact. In this largely unregulated landscape, they are falling victim to the theft of their voices for digital replication without consent or compensation.

Duncan Crabtree-Ireland is the National Executive Director of the country’s largest union for entertainment and media artists, the Screen Actors Guild – American Federation of Television and Radio Artists. He was a key negotiator in the recent 118-day actors’ strike.  After the union settled with the film and TV studios and streamers, the draft agreement — which discussed the use of cloned voices and images — was sent to union members for ratification.

During the ratification vote, a video of Mr. Crabtree-Ireland appeared online urging members to vote against the very proposal he had negotiated. It was a deepfake, impressive in both look and sound, that threatened to undermine the union’s breakthrough protections against unauthorized replication.

I was introduced to insidious non-consensual replicas when I represented a senior citizen who received a voicemail from his “son,” saying he was in trouble and needed $5,000 immediately. As the worried father was leaving the bank, his actual son called. They were able to stop the money transfer, but such scams are increasingly common and sophisticated.

Recently, we filed a class-action lawsuit on behalf of actors whose voices, we allege, were stolen both to train an AI-driven tool and to be cloned by it. The company, LOVO AI, sold the resulting text-to-speech tool to unsuspecting customers for use in thousands of projects.

The lead plaintiffs are actors Paul Skye Lehrman and Linnea Sage. Mr. Lehrman heard an AI clone of his voice being interviewed on the “Deadline Strike Talk” podcast, in an episode, ironically, about the use of AI in entertainment.

Yet Mr. Lehrman never recorded the show or consented to the use of his voice. The host asked questions, typed them into ChatGPT, and fed the resulting answers into a text-to-speech generator, which spoke them in Mr. Lehrman’s cloned voice.

LOVO says it is pursuing a revolutionary business strategy that could eliminate actual actors from voice-over creation. Creators simply upload a script, select the desired voice and generate the soundtrack without dealing with — or paying — actors.

AI-generated voices will be used in a number of industries, among them entertainment, education, and marketing. The Screen Actors Guild, for one, recently announced terms for advertisers who want to use AI-generated voices to personalize digital audio commercials based on listener specifics like time, weather, and location. So, “Good afternoon” would replace “Good morning” at lunchtime. Similarly, there is an agreement in the works for AI to generate voices for chatbots. 

We don’t want to ban AI-generated voices; we’re simply trying to protect the rights of artists whose voices are being cloned or used to train AI. Change is inevitable, but it can be regulated thoughtfully. Explicit consent, control and transparency over intended uses, and fair compensation must be at the core of any agreement or legislation.

This is, after all, a property rights question. Voice and likeness are legal property rights. While many states protect these rights from commercial exploitation, there are no uniform national standards. And there is no clear way to demand that platforms remove non-consensual digital fakes.

All of this could change dramatically and for the better if the just-introduced No Fakes Act passes Congress. The bipartisan proposal would establish intellectual property rights in voice and likeness, guarding against unauthorized clones while balancing First Amendment rights.

The law would make it illegal to produce unauthorized digital replicas and would hold platforms liable for knowingly hosting them. It wouldn’t stop criminals; against scams like the voicemail ploy, it is wise for families to agree on a secret question that an impostor can’t answer. The No Fakes Act, though, would give victims a civil remedy.

Instead of allowing AI to run amok now, and playing whack-a-mole later to rein it in, Congress could pass the No Fakes Act. It is a smart start on a complicated problem.



