Scarlett Johansson is right to be disturbed by a sound-alike AI chatbot — but the rest of us should be scared


Scarlett Johansson was Samantha, but she is not Sky.

Confused? Just wait until someone you know gets emotionally attached to the disembodied voice of a chatbot. As in “Her,” the 2013 film in which Theodore Twombly (Joaquin Phoenix) falls in love with Samantha (voiced by Johansson).

That movie was creepy. Now it seems eerily prescient.

But the words Johansson used this week were “eerily similar.” She was referring to Sky, one of the voices featured in OpenAI’s new GPT-4o. As she noted in a statement: “When I heard the released demo, I was shocked, angered and in disbelief that Mr. Altman (OpenAI’s CEO Sam Altman) would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference …”

It’s amazing how the humans foisting artificial intelligence upon the world can be so dumb. The company denied Sky is a Johansson sound-alike. Come on. When I first heard Sky, I looked over my shoulder expecting to see Joaquin in beige chinos drawing devil horns on a photo of Colin Jost.

Altman’s favourite AI film is “Her.” To him, it’s not sci-fi — it’s the future. He wanted Sky to conjure Samantha, which is why he tried to hire Johansson. When she declined to voice his chatbot, he found someone who sounded like her/Her.

Not wanting to tangle with Johansson — she recently sued Disney over a contractual dispute and got the behemoth to settle — OpenAI “discontinued” the voice of Sky. In a blog post this week, it doubled down on its innocence: “We believe that AI voices should not deliberately mimic a celebrity’s distinctive voice — Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress …”

Yes, who just happens to sound like Scarlett Johansson.

These nerds are going to kill us all. Were we not debating how AI posed an existential threat to humanity just a few months ago? The nerds covered their ears and got an algorithm to blurt out, La, la, la, I can’t hear you.

They know lawmakers are prone to inertia. So as we shrug helplessly toward a dystopian future, the nerds keep releasing new upgrades at breakneck speed. In the past year, ChatGPT has changed faster than Paige Spiranac at a Sports Illustrated swimsuit shoot in Maui.

Don’t like your job? In a few months, AI will quit it for you and then file for custody of your kids before changing your will to leave everything to Microsoft.

Scarlett Johansson is right to be disturbed by an AI chatbot that sounds like Scarlett Johansson. But it’s not just celebrities who need to be on guard about voice theft. Audio is easier than video to deepfake. Scammers can now trawl social media to capture the voice of anyone. Then with AI, they can weaponize your sonic cadence with malicious intent: Grandma, please send $10,000! They are holding me hostage in Moose Jaw!

Two years ago, Amazon announced plans to bring the voices of dead relatives to Alexa. Sounding like someone else was once an art reserved for impressionists. With free tools, it soon will be close to impossible to distinguish a real person from a fake person unless the doppelganger says stuff wildly out of character.

Let’s say I get a call from someone claiming to be my wife. She says, “Can you dig up and move the Asian lilies to a sunnier spot? And do not forget to file the orthodontic health claims.” Next? I grab a shovel and then log onto Sun Life.

Now, if the voice purred, “You are such a sexy beast,” then and only then would I shout, “WHO IS THIS?” AI will not make that mistake.

What happens if a fake Olivia Chow tells me recycling can now be exchanged for Jays tickets? I’d believe the fake mayor because I want to believe that message.

This is the game OpenAI was playing when it hired a voice actor who sounded like Johansson. It wanted to create a “Her” sequel for the real world. But since Johansson is a celebrity, she lawyered up and the company backed down. After all, there is legal precedent for successful sound-alike lawsuits, including Bette Midler versus Ford and Tom Waits versus Frito-Lay.

A company that hires an Arnold Schwarzenegger impersonator to hawk cooling bedsheets — “Hasta la vista, sweaty baby” — risks millions in damages.

But what about us normies? How do we protect our voices? What happens if someone who sounds like you starts leaving harassing voice mails for a neighbour or calls an airport with a bomb threat? That could be awkward.

How scary is the future with AI?

The final word goes to Samantha: “The past is just a story we tell ourselves.”

Credit belongs to: www.thestar.com
