FILE - Amazon Echo and Echo Plus devices, behind, sit near illuminated Echo Button devices during an event by the company in Seattle on Sept. 27, 2017. Amazon’s Alexa might soon replicate the voice of family members, even if they’re dead. The capability, unveiled at Amazon’s Re:Mars conference in Las Vegas Wednesday, June 22, 2022, is in development and would allow the virtual assistant to mimic the voice of a specific person based on less than a minute of recorded audio. (AP Photo/Elaine Thompson, File)

Amazon's Alexa could soon mimic voice of dead relatives


Amazon’s Alexa might soon replicate the voice of family members, even if they’re dead.

The capability, unveiled Wednesday at Amazon’s Re:Mars conference in Las Vegas, is in development and would allow the virtual assistant to mimic the voice of a specific person based on less than a minute of recorded audio.

Rohit Prasad, senior vice president and head scientist for Alexa, said at the event Wednesday that the desire behind the feature was to build greater trust in the interactions users have with Alexa by adding more “human attributes of empathy and affect.”

“These attributes have become even more important during the ongoing pandemic when so many of us have lost ones that we love,” Prasad said. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

In a video played by Amazon at the event, a young child asks “Alexa, can Grandma finish reading me the Wizard of Oz?” Alexa then acknowledges the request, and switches to another voice mimicking the child’s grandmother. The voice assistant then continues to read the book in that same voice.

To create the feature, Prasad said the company had to learn how to make a “high-quality voice” from a shorter recording, as opposed to hours of recording in a studio. Amazon did not provide further details about the feature, which is bound to spark more privacy concerns and ethical questions about consent.
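Amazon has not published technical details, but what Prasad describes, building a voice model from under a minute of audio, resembles few-shot voice cloning, where a short sample is compressed into a fixed-length "speaker embedding" that conditions a speech synthesizer. The toy NumPy sketch below illustrates only the enrollment-and-matching idea with a hand-rolled spectral embedding; real systems use learned neural encoders, and every name and signal here is illustrative, not Amazon's system:

```python
import numpy as np

def speaker_embedding(audio, frame_len=256):
    # Toy "speaker embedding": the average magnitude spectrum across frames.
    # The principle mirrors real voice cloning: compress a short recording
    # into one fixed-length vector that characterizes the voice.
    n_frames = len(audio) // frame_len
    frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)
    spectra = np.abs(np.fft.rfft(frames, axis=1))
    return spectra.mean(axis=0)

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Simulate two "speakers" as noisy tones at different pitches (illustrative).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16000, endpoint=False)  # 1 second at 16 kHz
grandma = np.sin(2 * np.pi * 220 * t) + 0.05 * rng.standard_normal(t.size)
stranger = np.sin(2 * np.pi * 440 * t) + 0.05 * rng.standard_normal(t.size)
grandma_again = np.sin(2 * np.pi * 220 * t) + 0.05 * rng.standard_normal(t.size)

emb_enrolled = speaker_embedding(grandma)      # the short "enrollment" sample
emb_match = speaker_embedding(grandma_again)   # new audio, same voice
emb_other = speaker_embedding(stranger)        # a different voice

# The enrolled voice is closer to itself than to a different voice.
print(cosine_similarity(emb_enrolled, emb_match) >
      cosine_similarity(emb_enrolled, emb_other))
```

In a production cloning system the embedding would then steer a neural vocoder to generate new speech in that voice, which is exactly the step that raises the consent questions noted above.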

Amazon’s push comes as competitor Microsoft earlier this week said it was scaling back its synthetic voice offerings and setting stricter guidelines to “ensure the active participation of the speaker” whose voice is recreated. Microsoft said Tuesday it is limiting which customers get to use the service, while also continuing to highlight acceptable uses such as an interactive Bugs Bunny character at AT&T stores.

“This technology has exciting potential in education, accessibility, and entertainment, and yet it is also easy to imagine how it could be used to inappropriately impersonate speakers and deceive listeners,” said a blog post from Natasha Crampton, who heads Microsoft’s AI ethics division.

© Copyright 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

©2022 GPlusMedia Inc.

13 Comments

I wonder how they are planning to handle the very real possibility for abuse this technology means.

4 ( +5 / -1 )

Creepy!

6 ( +7 / -1 )

That is just creepy. I don’t think this would be a good thing for people dealing with mental health issues.

5 ( +6 / -1 )

There is a line in technology which should not be crossed. This crosses such a line.

4 ( +5 / -1 )

I had a cousin who left a voice memo on my answering machine before they died. I used to listen to it periodically, it was nice to hear their voice.

Not sure if I want my AIs using it though...

3 ( +4 / -1 )

Not ok...

2 ( +3 / -1 )

'I hear voices of dead people' could be a line from a horror movie

2 ( +2 / -0 )

Nice plot for a murder mystery. “Yes, detective, he claims his grandma told him to do it, but grandma has been dead for years...” Has Amazon really thought this through?

1 ( +1 / -0 )

People who don’t want to use it don’t have to… nobody is putting a gun to their heads forcing them to use it! It’s really good for people who miss their loved ones after they pass on and can keep them around as it helps them!

-1 ( +0 / -1 )

I'm sorry but I don't want the voice of my deceased loved ones coming out of a machine. That's just some next level creepiness.

2 ( +2 / -0 )

purple_depressed_bacon, June 25 08:33 am JST:

I'm sorry but I don't want the voice of my deceased loved ones coming out of a machine. That's just some next level creepiness.

This has the same notion to me of those 'touring' hologram shows (like ABBA). It's all based on what was said or done by that person before. And I wouldn't want a 'ghost' talking to me, that's basically what it is. A ghost in the machine. No thanks.

1 ( +1 / -0 )

Just what everybody wants, their parents scolding them from the grave !

4 ( +4 / -0 )

@desert

classic !!

1 ( +1 / -0 )
