On stage at re:MARS this week, Amazon showcased an evolving Alexa feature intended to mimic the flow of natural conversation. A conversation between two people rarely follows a predefined structure. It goes to strange and unexpected places. One subject flows into another as participants inject their life experience.
In a demo, a conversation about trees turns into a conversation about hiking and parks. In the context of the company’s AI, Rohit Prasad, senior vice president and chief scientist for Alexa, refers to the phenomenon as “conversational exploration.” It’s not a feature ready for prime time; there’s no switch to flip that will suddenly enable these free-flowing conversations. Rather, it’s part of an evolving idea of how Alexa can interact with users in a more human, or perhaps more humane, way.
Smart assistants like Alexa have traditionally operated on a much more simplistic question-and-answer model. Ask Alexa the weather, and Alexa will tell you the weather in a predetermined area. Ask it the A’s score (or, frankly, probably don’t), and Alexa will tell you the A’s score. It’s a straightforward interaction, not unlike typing a question into a search engine. But real-world conversations rarely happen this way.
“There’s a whole class of questions Alexa gets that contain a lot of information. When those questions come up, you can imagine they’re not point questions,” Prasad told TechCrunch in a conversation at the event. “They’re really about a topic the customer wants to know more about. What’s top of mind right now is what’s happening with inflation. We get so many of those requests to Alexa, and it gives you that kind of exploration experience.”
Conversational features like these are, however, part of how a home assistant like Alexa grows up. Eight years after its launch by Amazon, the assistant is still learning: collecting data and determining the best ways to engage with consumers. Even when something gets to the point where Amazon is ready to showcase it on a keynote stage, tweaks are still needed.
“Alexa must be an expert on many subjects,” explains Prasad. “That’s the big paradigm shift, and that kind of expertise takes a while to achieve. This is going to be a journey; with our customer interactions, Alexa won’t know everything from day one. But these questions can evolve into explorations where you end up doing something you didn’t think you would.”
Seeing the word “Empathy” in big, bold letters on the screen behind Prasad was an eye-catcher, though perhaps not as much as what came next.
There are some simple scenarios where the concept of empathy could or should come into play when talking to humans and smart assistants alike. Take, for example, the ability to read social cues. It’s a skill we gain through experience: the ability to read the sometimes subtle language of faces and bodies. Emotional intelligence for Alexa is a concept Prasad has been discussing for years. That starts with changing the assistant’s tone to respond in a way that conveys happiness or disappointment.
The flip side is determining a human speaker’s emotion, a concept the company has been working to perfect for years. That work has manifested itself in several ways, including the company’s controversial wearable Halo, which debuted in 2020 and offers a feature called Tone that purported to “analyze energy and positivity in a customer’s voice so they can understand how they sound to others and improve their communication and relationships.”
“I think both empathy and affect are well-known ways of interacting with each other, in terms of building relationships,” Prasad said. “Alexa can’t be tone-deaf to your emotional state. If you came in and you’re not in a happy mood, it’s hard to tell what to do. Someone who knows you well will react in a different way. It’s a very high bar for the AI, but it’s something you can’t ignore.”
The executive notes that Alexa has already become something of a companion for some users, particularly among the older demographic. A more conversational approach would probably only amplify that phenomenon. In Astro demos this week, the company frequently referred to the home robot’s almost pet-like presence in the home. Such concepts have their limitations, however.
“It shouldn’t hide the fact that it’s an AI,” Prasad added. “When it comes down to it [where] it’s indistinguishable – which we are far from – it still has to be very transparent.”
A subsequent video demonstrated an impressive new speech synthesis technology that uses just a minute of audio to create a convincing approximation of a person speaking. In it, a grandmother’s voice reads “The Wizard of Oz” to her grandson. The idea of commemorating loved ones through machine learning is not entirely new. Companies like MyHeritage use technology to animate images of deceased relatives, for example. But these scenarios invariably, and understandably, raise some hackles.
Prasad was quick to point out that the demo was more of a proof of concept, highlighting the underlying speech technologies.
“It was more about the technology,” he explained. “We are a very customer-obsessed science company. We want our science to mean something to customers. Unlike many cases where generation and synthesis have been used without the proper safeguards, this feels like something customers would love. We need to give them the right controls, including whose voice it is.”
With that in mind, there is no timeline for such a feature, if indeed such a feature will ever come to Alexa. The executive notes, however, that the technology that would power it is an area of active work in Amazon’s labs. And again, if it arrives, it would require some of the aforementioned transparency.
“Unlike deepfakes, if you’re transparent about what it’s being used for, there’s a clear decision maker, and the customer is in control of their data and what they want it to be used for, I think this is the right set of steps,” Prasad explains. “This wasn’t about a ‘dead grandma.’ The grandma is alive in this one, just to be very clear about it.”
Asked what Alexa might look like 10 to 15 years from now, Prasad explains that it’s all about choice, though less about infusing individual and unique personalities into Alexa than about providing a flexible computing platform for users.
“It should be able to accomplish anything you want,” he said. “It’s not just through voice; it’s intelligence at the right time, and that’s where ambient intelligence comes in. It should help you proactively and anticipate your needs in some cases. This is where we deepen conversational exploration. Everything you are looking for: imagine how much time you spend booking a holiday [when you don’t] have a travel agent. Imagine how much time you spend buying that camera or TV you want. Everything that takes your time to search should become much faster.”