Amazon wants its Alexa voice assistant to follow consumers wherever they go.
At its annual Alexa Live developer summit this week, the company discussed features and tools designed to make Alexa a component of everything from smartphones and cars to smart speakers and household appliances. Platform enhancements included improved natural-language understanding to deliver services.
The company also announced that its Alexa-enabled Echo devices would support the Matter interoperability protocol used by smart home device manufacturers. To give hardware makers more options, the company introduced a toolkit for making devices with multiple assistants to let consumers decide which to use.
Amazon’s Alexa advancements aim to ride consumer and business adoption of voice assistants. Analysts forecast that by 2025, one in two knowledge workers will use a virtual assistant daily, up from 2% in 2019.
A 2018 PricewaterhouseCoopers survey found that 90% of consumers were familiar with voice-enabled devices. More than 70% preferred using their voice to search for products, services or anything else online rather than typing.
Voice industry professionals globally see Amazon Alexa as the leading voice assistant development platform, according to a survey by Voicebot.ai. The research and news organization found Google Assistant was second, with Apple Siri a distant third.
At Alexa Live, Amazon rolled out tools that let developers choose up to five suggested phrases to launch Alexa-powered apps, called Skills. The additional options free users from having to remember specific terms and make Alexa appear more attuned to what the user wants.
The enhancement is part of Amazon’s name-free interaction (NFI) preview available in the U.S., the U.K. and India. The NFI tool collection lets developers use common words to launch Skills.
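The idea behind multiple suggested launch phrases can be sketched as a simple phrase-to-Skill lookup. This is a hypothetical illustration only — the phrase lists and Skill names below are invented, and the real Alexa Skills Kit resolves utterances far more flexibly than exact matching:

```python
# Hypothetical sketch: several natural phrases resolving to one Skill.
# These names are invented for illustration and are not part of the
# Alexa Skills Kit API.

LAUNCH_PHRASES = {
    "play relaxing sounds": "SleepSoundsSkill",
    "help me fall asleep": "SleepSoundsSkill",
    "white noise please": "SleepSoundsSkill",
    "order my usual coffee": "CoffeeShopSkill",
    "get me a latte": "CoffeeShopSkill",
}


def resolve_skill(utterance):
    """Return the Skill registered for an utterance, or None."""
    return LAUNCH_PHRASES.get(utterance.strip().lower())
```

The point of the design is that no single phrase is privileged: any of the registered utterances launches the same Skill, so the user does not need to recall an exact invocation name.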
Amazon plans to expand Alexa’s reach in smart homes through an Echo device update that will add support for Matter, a communication protocol developed by the Connectivity Standards Alliance.
Amazon’s new Multi-Agent Experience (MAX) toolkit provides developers with middleware components that let consumers choose which assistant on a device should answer their query. Samsung used the technology to make its latest Family Hub smart refrigerator respond to commands through either Alexa or Bixby, Samsung’s own voice assistant.
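Conceptually, this kind of middleware routes each utterance to the agent whose wake word it begins with. The sketch below is an invented illustration of that dispatch pattern, not the MAX toolkit's actual interfaces, which the article does not detail:

```python
# Hypothetical sketch of multi-assistant dispatch on a single device.
# Class, method, and wake-word names are invented for illustration;
# they are not the MAX toolkit API.

class MultiAgentDispatcher:
    def __init__(self):
        self._agents = {}  # wake word -> handler function

    def register(self, wake_word, handler):
        """Register a handler for a wake word, e.g. 'alexa' or 'bixby'."""
        self._agents[wake_word.lower()] = handler

    def dispatch(self, utterance):
        """Route 'WakeWord, query' to the matching agent's handler."""
        wake, _, query = utterance.partition(",")
        handler = self._agents.get(wake.strip().lower())
        if handler is None:
            return "No assistant registered for that wake word."
        return handler(query.strip())


# Usage: a smart fridge responding to either assistant.
dispatcher = MultiAgentDispatcher()
dispatcher.register("alexa", lambda q: f"Alexa handles: {q}")
dispatcher.register("bixby", lambda q: f"Bixby handles: {q}")
```

The design keeps each assistant's logic behind its own handler, so adding a third agent is just another `register` call rather than a change to the device firmware.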
MAX is the latest advancement in Amazon’s Voice Interoperability Initiative (VII), launched last year. More than 80 companies support the initiative, including Facebook, Salesforce, Verizon, and consumer device makers Sony and SONOS. Google and Apple are not VII members.
Amazon has taken Alexa to more devices by licensing a white-label version. In June, auto electronics maker Continental and software developer Elektrobit introduced the first in-vehicle voice-activation system that uses Alexa to perform tasks. Possible uses include launching an audiobook, adjusting windows and mirrors, or turning the air conditioning on and off.
Amazon’s ambitions for Alexa could get derailed if the company fails to convince consumers to trust it with the amount of personal data needed for Alexa to become an intuitive assistant.
Ultimately, Amazon would use that data to sell more products and services. At the same time, it will have to be transparent about what information it collects, how that information is used and who has access to it.