CES 2018 saw the proliferation of digital assistant applications in cars (and homes, of course) with Harman International, Panasonic and Visteon showing multiple digital assistant implementations with in-dash infotainment systems. Panasonic showed a hybrid Alexa system capable of working offline and Harman showed a system with a dial to allow the user to select their preferred digital assistant: Alexa, Google Assistant, Cortana or Bixby.
The leader in hybrid automotive speech recognition systems, Nuance, demonstrated a system capable of automatically selecting the appropriate digital assistant depending on the task. Meanwhile, German Autolabs was demonstrating an aftermarket device with multiple no-look, no-touch human-machine interface options – including speech recognition and gesture – for communicating while driving.
For German Autolabs the over-arching message was clear: the age of de-appification had begun. Not everyone got the message in 2018, but CES 2019 is arriving in two weeks with an escalation in digital assistant integration.
No one is obsessing over recognizing accented voices or quibbling over when speech recognizers will be acceptable. Digital assistants have arrived and car makers and their suppliers are being forced to reckon with the consequences.
It’s not simply a question of accessing vehicle functions or cloud content or service resources. Voice interaction in the car is rapidly turning cars into browsers and driver and passenger queries into actionable and monetizable inputs.
De-appification, an expression coined by German Autolabs CEO, Holger Weiss, refers to the reality that drivers and passengers will no longer be accessing content, applications and services via on-screen icons. The world of content and service delivery in the car will be an eyes-free and hands-free experience driven by voice.
More importantly, an increasing portion of the recognition and the process of responding to and/or anticipating driver and passenger needs will be supported by on-board artificial intelligence. The car will become more intelligent through the process of coordinating on-board sensor inputs, mobile device content and service information and cloud resources.
CloudMade was one of the first companies to demonstrate this capability. The competition to deliver this value proposition in 2019 and beyond will be fierce.
The entire vehicle will become an intelligent digital assistant anticipating driver needs and enhancing the driving experience. The most advanced systems will integrate with safety systems, creating the opportunity for the vehicle to communicate and converse with the driver like the computer in “2001: A Space Odyssey” or like “KITT” in “Knight Rider.”
The implications for the development of in-vehicle systems in 2019 are significant and include:
- The launch of OEM-branded systems such as “Hey BMW” and “Hey Mercedes” as front-end interfaces to cloud partners including Alexa, Bixby, Google Assistant, Cortana and Siri;
- The integration of smart-home digital assistants – such as Orange’s new Djingo – with vehicle-based systems;
- The capture of these driver (and passenger) requests to better anticipate driver needs and wants for integration with contextual marketing and payment platforms;
- The demise of app-and-icon-centric in-vehicle user interfaces in favor of voice-and-gesture-centric systems, like German Autolabs’ “Chris”;
- The near elimination of human-centric call centers for concierge, roadside assistance and emergency services;
- The enhancement of emergency response with artificial intelligence systems capable of instantly determining vehicle, driver and passenger status and automatically communicating the appropriate information to first responders and next of kin;
- The enhancement of customer relationship management systems integrating with dealers to build stronger customer retention programs.
Multiple suppliers will use CES 2019 to demonstrate platforms designed to collect and interpret vehicle data. The next phase in this evolution, though, will be the integration with artificial intelligence and digital assistance systems intended to bring vehicles to life with a smarter, safer and more productive operating environment.
Will there be hiccoughs ahead? Definitely. Natural language systems capable of carrying on layered dialogues are still evolving and will take time to reach market introduction. But it is not going too far to suggest that in-vehicle speech systems, by the end of 2019, will be capable of conversing with drivers to either help preserve alertness or to establish that a respite from driving is required.
In the end, what started out as a handy tool for the hands-free entry of destinations, the dialing of phone numbers or the selection of songs or radio stations will begin saving lives with timely alerts and guidance. CES 2019 will usher in this new age of voice-based driver assistance.
Roger C. Lanctot is Director, Automotive Connected Mobility in the Global Automotive Practice at Strategy Analytics. Roger will keynote the Consumer Telematics Show on January 7 at Planet Hollywood. More details about Strategy Analytics can be found here: https://www.strategyanalytics.com/access-services/automotive