In homes all across the world, devices like Amazon Alexa integrate into daily life to make routine activities easier and more convenient. From turning on lights and playing music to answering questions and managing to-do lists, these voice skills have become a must-have for millions. Similar technology is exploding in popularity and availability in automotive HMI too.
For drivers, it’s more than simply a convenience to have voice-controlled features. Voice AI can also aid in making roads safer for everyone. Why is Voice AI important, and what stands in the way of development? And how does it fit in HMI solutions for the future? Here’s what we think.
Why automotive HMI technology needs voice AI
In all but one US state, there is at least some form of ban on texting and driving, and many also ban hand-held phone calls. The National Highway Traffic Safety Administration reports that distracted driving led to more than 3,000 deaths in 2020 alone. There’s little doubt about how dangerous it can be when eyes aren’t on the road, which is a primary reason that voice-controlled features are crucial.
However, in HMI development for automotive, voice AI goes beyond verbal commands to call or text someone. Advanced systems integrate mapping for faster routes to a destination, voice control for HVAC systems, and search for the playlist or podcast you'd like on your favorite streaming service. The two-fold achievement is a more satisfying in-cabin driver experience and less time distracted from watching the road.
Challenges with voice in automotive HMI development
The task of integrating voice seamlessly into vehicle operations has its challenges, though. First, the user experience needs to be central, since features that drivers find frustrating or complex don't get used. It isn't simply accurate recognition that's at the core, but a natural, intuitive operation so drivers don't need to read instructions.
Also, there's little uniformity among options on the market. Many car manufacturers like Mercedes-Benz and BMW have aimed to develop automotive cockpit HMI that caters to their unique clientele, developers like Amazon Alexa and Android Automotive OS have their own integrations, and creators like Star design services can build new voice solutions or work with existing SDKs. The UX gets muddled when drivers face different command sets as they pilot different vehicles, leading to frustration.
Of course, there’s also a segment that simply doesn’t find value in the advanced features and isn’t willing to embrace the change. That segment is shrinking, though, and quickly.
Where are HMI and voice control heading?
Thought leaders believe that voice AI is still in its fledgling state in automotive HMI and digital cockpit solutions. By the end of the decade, it's expected that major advancements will make voice controls in a vehicle more integral than ever.
- Additional in-car voice ecosystems are likely to be developed both by carmakers and automotive digital solutions providers. These will offer even more choices for users as companies seek to supply a unique service proposition.
- Voice commands will become more natural. AI will allow users to speak as they would in normal conversation without requiring specific keyphrases. Indirect conversation could also trigger functions: mentioning "I'm feeling chilly" might turn on a heated seat or raise the cabin temperature.
- Voice AI will lend itself to an interactive experience. Instead of just controlling functions, features like Drivetime.fm that play voice-controlled games with the driver will become a hotbed for developers.
- Operations will become more deeply embedded in connected cars. As the tech progresses with V2X, voice AI will be able to reduce time spent in traffic, intelligently plan multi-stop routes, tie into self-driving tech, and help avoid accidents.
The world of voice AI is exciting, and both carmakers and developers like Star Automotive are only limited by their imagination. The next years will see major advancements in how users interact with their cars.