Amazon’s Alexa impersonates Grandma


Those were just some of the reactions that poured in over social media when Amazon.com Inc.'s Alexa impersonated a grandmother reading an excerpt from The Wonderful Wizard of Oz. It all started innocently, with Alexa chief scientist Rohit Prasad trying to demonstrate the digital assistant's humanlike mien during a company presentation Wednesday. Prasad said he had been surprised by the companionable relationship users develop with Alexa and wanted to explore it. Human characteristics like empathy and affect, he said, are the key to building trust with people.

In these times of the ongoing pandemic, those attributes have become more important, he said, because so many of us have lost someone we love. AI can't eliminate the pain of loss, he said, but it can make those memories last. The presentation left the impression that Amazon was promoting the service as a tool for digitally raising the dead. Prasad walked that back a bit in a subsequent interview on the sidelines of Amazon's re:MARS technology conference in Las Vegas, saying the service wasn't designed to simulate the voices of dead people.

The feature, he said, isn't about people who are no longer with you; rather, if a grandmother isn't available and you want your child to hear her voice, you could use it to do that. The creep factor dominated the discourse as the presentation ricocheted around the web. More serious concerns emerged as well, among them the possibility of using the technology to create deepfakes: in this case, using a legitimate recording to make people appear to say something they never actually vocalized.

Siwei Lyu, a professor of computer science and engineering at the University at Buffalo whose research involves deepfakes and digital media forensics, said he was concerned about the development.

There are certainly benefits to the voice-conversion technologies Amazon is developing, he said, but we should be aware of the potential misuses: a predator could disguise himself as a family member or friend in a phone call to lure unsuspecting victims, and a falsified audio recording of a high-level executive commenting on her company's financial situation could send the stock market awry. Amazon didn't say when the new Alexa feature would be rolled out, but similar technology could make such mischief a lot easier. Prasad said Amazon had learned to simulate a voice from less than a minute of a person's speech, a feat that previously required hours in a studio.
