Alexa goes down the conversational rabbit hole


On stage at re:MARS this week, Amazon showed off an in-development Alexa feature designed to mimic the flow of natural language. A conversation between two people rarely follows a predetermined structure. It wanders into strange and unexpected places, one topic flowing into another as participants share their life experiences.


In the demo, talk of trees turns into talk of hiking and parks. In the context of the company’s artificial intelligence, Alexa senior vice president and head scientist Rohit Prasad calls the phenomenon “conversational exploration.” It is not, admittedly, a catchy name for the feature, and there is no switch to flip that suddenly makes conversations natural overnight. Rather, it is part of an evolving vision of how Alexa can interact with users in a more human — or perhaps more humane — way.


Smart assistants like Alexa have traditionally offered a far more simplified question-and-answer model. Ask Alexa about the weather and Alexa will tell you the weather in a predetermined area. Ask her a question and Alexa will (or, frankly, quite possibly won’t) give you an answer. It’s a simple interaction, akin to typing a query into a search engine. But real-world conversations rarely work that way.

“There are a number of questions Alexa receives that carry a lot of information. When questions like that come up, you can imagine these aren’t point questions,” Prasad told TechCrunch in a conversation at the event. “They’re really about something the customer wants to learn more about. Right now, what’s happening with inflation is top of mind for a lot of people. We get plenty of requests like that on Alexa, and it lends itself to this kind of exploration experience.”


Conversational features like these, however, remain aspirational for a home assistant like Alexa. Eight years after Amazon launched it, the assistant is still learning, collecting data and identifying the best ways to interact with consumers. Even now that it has reached the point where Amazon is willing to showcase it on the main stage, it still needs some tweaking.

“Alexa has to be an expert on many topics,” Prasad explained. “That’s a big paradigm shift, and it takes time to gain that kind of expertise. It will be a journey; Alexa won’t know everything from day one, but through interactions with our customers, these questions can develop into new explorations that take you somewhere you hadn’t even thought of.”

Seeing the word “Empathy” written in big, bold letters on the stage behind Prasad was jarring, though perhaps not as much as what followed.

There are a few simple scenarios where the concept of empathy can or should come into play when talking to humans and smart assistants alike. Take, for example, the ability to read social cues, a skill we acquire with experience: reading the sometimes subtle language of faces and bodies. Emotional intelligence for Alexa is a concept Prasad has been discussing for years. It starts with adjusting the assistant’s tone so that she responds in a way that conveys joy or frustration.

The other side is detecting the emotions of the person speaking, a concept the company has been working on for several years. That work has manifested in many ways, including the company’s controversial Halo wearable, which debuted in 2020 and offers a feature called “Tone” designed to “analyze energy and positivity in a customer’s voice so they can understand how they sound to others and improve their communication and relationships.”

“I think both empathy and affect are well-known modes of interaction in terms of relationship building,” Prasad said. “Alexa can’t be tone-deaf to your emotional state. If you walk in and you’re not in a good mood, it’s hard to say what it should do. Someone who knows you well will react differently. That’s a very high bar for AI, but it’s one that can’t be ignored.”

The executive notes that Alexa has already become a kind of companion for some users, particularly among the elderly. A more conversational approach is likely to amplify that phenomenon. In this week’s Astro demos, the company frequently portrayed the home robot as having an almost pet-like role in the home. Such representations have their limitations, however.

“It shouldn’t hide the fact that it’s AI,” Prasad added. “When it gets to [the point where] it’s indistinguishable, which we are very far from, it still has to be very transparent.”

A follow-up video showcased an impressive new voice synthesis technology that uses just a minute of audio to create a convincing approximation of a person speaking. In it, the voice of a grandmother reads The Wizard of Oz to her grandson. The idea of memorializing loved ones through machine learning is not entirely new; companies like MyHeritage, for example, use the technology to animate images of deceased relatives. But such scenarios are invariably, and understandably, divisive.

Prasad was quick to note that the demonstration was more a proof of concept, highlighting the underlying voice technology.

“It was more about the technology,” he explained. “We are a very customer-obsessed science company. We want our science to mean something to our customers. Unlike a lot of cases where generation and synthesis have been used without the right guardrails, this seems like something customers would love. We have to give them the right set of controls, including whose voice it is.”

With that in mind, there is no time frame for such a feature, if such a feature ever actually arrives in Alexa at all. The executive notes, however, that the technology meant to power it is working very well in Amazon’s labs. And again, if it does show up, it will require some of the aforementioned transparency.

“Unlike deepfakes, if you are transparent about what it is used for, there is a clear decision-maker, and the customer is in control of their data and what they want it used for, I think that is the right set of steps,” Prasad explained. “And it was not about a ‘dead grandmother.’ The grandma is alive in this one, just to be perfectly clear.”

Asked what Alexa might look like in 10 or 15 years, Prasad explains that it’s all about choice, though less about imbuing Alexa with individual and unique personalities than about offering users a flexible computing platform.

“It should be able to do anything you want,” he said. “It’s not just through voice; it’s intelligence at the right moment, and that’s where ambient intelligence comes in. In some cases, it should proactively help you and anticipate your needs. That’s where we continue our conversational exploration. Anything you search for: imagine how much time you spend booking a vacation [when you don’t] have a travel agent. Imagine how much time you spend buying the camera or TV you need. Anything that requires you to spend time searching should become much faster.”


