By now, [the news about Scarlett Johansson’s issues with OpenAI and the voice that sounds like hers](https://www.theverge.com/2024/5/20/24161253/scarlett-johansson-openai-altman-legal-action) has made the rounds. She’s well known, and regardless of one’s interests she’s likely to pop up in various contexts. However, she’s not the first.
While different in some ways, [voice actors Paul Skye Lehrman and Linnea Sage are suing Lovo for similar reasons](https://edition.cnn.com/2024/05/17/tech/voice-actors-ai-lawsuit-lovo/index.html?Date=20240518&Profile=cnn&utm_content=1716000674). They were hired for what they thought were one-off voiceovers, then later heard their voices saying things they had never said. To the point: they heard their voices doing work they were never paid for.
The way they found out was oddly poetic.
> Last summer, as they drove to a doctor’s appointment near their home in Manhattan, Paul Skye Lehrman and Linnea Sage listened to a podcast about the rise of artificial intelligence and the threat it posed to the livelihoods of writers, actors and other entertainment professionals.
> The topic was particularly important to the young married couple. They made their living as voice actors, and A.I. technologies were beginning to generate voices that sounded like the real thing.
> But the [podcast](https://podcasts.apple.com/us/podcast/week-seven-with-guest-poe-aka-ai/id1686019933?i=1000617737521) had an unexpected twist. To underline the threat from A.I., the host conducted a lengthy interview with a talking chatbot named Poe. It sounded just like Mr. Lehrman.
> “He was interviewing my voice about the dangers of A.I. and the harms it might have on the entertainment industry,” Mr. Lehrman said. “We pulled the car over and sat there in absolute disbelief, trying to figure out what just happened and what we should do.”

[What Do You Do When A.I. Takes Your Voice?](https://www.nytimes.com/2024/05/16/technology/ai-voice-clone-lawsuit.html), Cade Metz, New York Times, May 16th, 2024.
They aren’t sex symbols like Scarlett Johansson. Neither was the world’s highest-paid actress in 2018 and 2019, as she was. They aren’t *seen* in major films. Their problem is just as real, just as audible, but not quite as visible. Forbes covered the problems voice actors faced in October of 2023.
> …Clark, who has voiced more than 100 video game characters and dozens of commercials, said she interpreted the video as a joke, but was concerned her client might see it and think she had participated in it — which could be a violation of her contract, she said.
> “Not only can this get us into a lot of trouble if people think we said [these things], but it’s also, frankly, very violating to hear yourself speak when it isn’t really you,” she wrote in an email to ElevenLabs that was reviewed by *Forbes*. She asked the startup to take down the uploaded audio clip and prevent future cloning of her voice, but the company said it hadn’t determined that the clip was made with its technology. It said it would only take immediate action if the clip was “hate speech or defamatory,” and stated it wasn’t responsible for any violation of copyright. The company never followed up or took any action.
> “It sucks that we have no personal ownership of our voices. All we can do is kind of wag our finger at the situation,” Clark told *Forbes*…

[‘Keep Your Paws Off My Voice’: Voice Actors Worry Generative AI Will Steal Their Livelihoods](https://www.forbes.com/sites/rashishrivastava/2023/10/09/keep-your-paws-off-my-voice-voice-actors-worry-generative-ai-will-steal-their-livelihoods/?sh=47aa19717b27), Rashi Shrivastava, Forbes.com, October 9th, 2023.
As you can see, the whole issue is not new. It just became more famous because of a more famous face, and because it involves OpenAI, a company with more questions about its training data than ChatGPT can answer, so the story has been sung from the rooftops.
Meanwhile, [some are trying to license the voices of dead actors](https://aibusiness.com/nlp/ai-allows-dead-actors-to-narrate-audiobooks).
[Sony recently warned AI companies](https://www.theverge.com/2024/5/17/24158887/sony-music-ai-training-letter) about unauthorized use of the content it owns, but when one’s content is necessarily public, how do you enforce that?
[How much of what you post, from writing to pictures to voices in podcasts and family videos, can you control](https://knowprose.com/2024/04/20/paying-to-whitewash-the-fence-of-ai/)? Using it costs these companies nothing, but it costs individuals their futures. And when it comes to training models, these AI companies are eroding the very trust they need from the people they want to sell their product to – unless they’re just enabling talentless and incapable hacks to take over jobs that talented and capable people already do.
We have more questions than answers, and the trust erodes as more and more people are impacted.
https://knowprose.com/2024/05/22/beyond-a-widowed-voice/
#AI #artificialIntelligence #deepfake #openai #ScarlettJohansson #socialMedia #society #Technology #voiceActor #voiceover