Hot potato: Amazon is developing a feature that will let the Alexa voice assistant imitate any human voice after hearing less than a minute of it. Putting the inherent spookiness of the feature aside, some are concerned about its potential for abuse.

Rohit Prasad, who leads the Alexa team at Amazon, said the goal of the project is to “preserve memories” at a time when “many of us have lost someone we love” as a result of the pandemic.

Alexa can be trained to simulate a voice from pre-recorded audio, which means the source speaker does not need to be present, or even alive. In a video shown during this week’s conference, a child asks Alexa whether Grandma can finish reading The Wizard of Oz; Alexa then shifts into the grandmother’s voice to finish the story.
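Amazon has not published how its system works, but the general technique it describes, cloning a voice from a short reference clip, already exists in open-source toolkits. Here is a minimal sketch using Coqui TTS’s YourTTS model as a stand-in; the file grandma_sample.wav is a hypothetical reference recording of roughly a minute of speech.

```python
# Sketch of zero-shot voice cloning with the open-source Coqui TTS toolkit.
# This is NOT Amazon's implementation, only an illustration of the technique.
from TTS.api import TTS

# Load a multilingual model that supports cloning from a short reference clip.
tts = TTS(model_name="tts_models/multilingual/multi-dataset/your_tts")

# Synthesize new speech in the reference speaker's voice.
tts.tts_to_file(
    text="Dorothy lived in the midst of the great Kansas prairies.",
    speaker_wav="grandma_sample.wav",  # hypothetical ~1-minute reference audio
    language="en",
    file_path="cloned_output.wav",
)
```

The point of the sketch is how little data the approach needs: a single short clip conditions the model’s speaker embedding, and arbitrary new text can then be rendered in that voice.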

Prasad said during the presentation that Alexa now receives billions of requests per week from hundreds of millions of Alexa-enabled devices, in 17 languages across more than 70 countries.

The potential for abuse seems high. The tool could be used to create convincing deepfakes for disinformation campaigns or political propaganda, and fraudsters could exploit it for financial gain, as in 2020, when scammers used a cloned voice to trick a bank manager into authorizing $35 million in transfers to finance an acquisition that didn’t exist.

What do you think about this? Is Amazon going too far with the concept of voice cloning, or are you intrigued by the idea of “talking” to someone from the grave?

Image credit: Jan Antonin Kolar