Car voice commands won't suck with Nuance's assistant

Prompted by an activation phrase, Dragon Drive recognizes a driver named Lior by his voice.


Wayne Cunningham/CNET Roadshow

Voice command in cars shows great potential to help drivers keep their eyes on the road, but since its earliest implementations, the technology has largely resulted in frustration. Sure, placing a call to a specific contact usually works, but just try finding a destination in the navigation system. It's even worse when the car doesn't show what commands it understands.

Nuance, the company behind the majority of voice systems in cars, thinks it has the problem licked through the use of machine learning and the cloud, essentially equipping cars with a virtual assistant.

Eric Montague, Nuance’s senior director of product marketing and strategy, says the company’s latest automotive platform favors a hybrid approach, embedded in dashboard electronics but supplemented by support from the cloud. Services such as Siri, Android Assistant and Amazon’s Alexa show the accuracy and flexibility of cloud-based voice command. Nuance’s embedded technology lets the system work, in limited form, even without a data connection.

Cloud and car

To show off the latest advances in Dragon Drive, Nuance’s automotive voice command platform, Montague dropped by Roadshow headquarters in a modified Chrysler Pacifica sporting six microphones, two for each seating row, and a dashboard fitted with the company’s latest electronics.

This screen, which a production version of the technology would not display, shows the variety of parameters Dragon Drive uses to identify a parking garage.


When Montague told the car, “find covered parking near Fort Mason from 3 p.m. to 7 p.m.”, the screen showed a “thinking” message for a moment, then came up with a map highlighting a few parking areas that met the criteria. Montague touted Nuance’s database, which, through partnerships with mapping company Here and Parkopedia, includes a number of parameters for parking lots, such as pricing, whether they take credit cards and hours of operation. Nuance’s voice engine processes natural language requests to find the best matches.

But rather than rely only on what drivers say, Nuance designed the system to know enough about individual drivers to also choose a lot based on past preferences. That’s where machine learning comes in. As Montague pointed out, a driver might choose parking lots that cost less but require a five- to 10-block walk to the final destination. Dragon Drive could learn that preference over a short time, then highlight those options in its results.

As Montague put it, “Nuance doesn’t want to just give a result, but give the best results.”

And, of course, Dragon Drive could include more than just parking information. Montague said Nuance is integrating fuel station information, and anything else relevant to driving, into the in-car assistant. Likewise, the Nuance cloud will interact with other cloud services, such as Amazon Alexa. And while Siri and Android Assistant can do navigation, Montague points out that these systems don’t have deep integration with a car, so they don’t know its fuel level, or whether the windshield wipers are on. Nuance can use all this data to make its in-car assistant more useful, for example finding a covered parking spot if the windshield wipers are on, a pretty sure sign that it is raining.

Your voice is your passport

Taking the in-car assistant idea further, Montague showed how Dragon Drive can use biometric voice recognition to identify individuals.

As different people in the demonstration vehicle said the activation phrase, “Hey Dragon”, the car recognized those who had a registered profile, and also gave guest access to others. For Montague, who had registered a profile, the car made his personal contacts and calendar available.

Montague said that Nuance is looking into facial recognition to enhance security for its voice recognition biometrics. This technology plays well in a sharing economy, as a driver’s personal profile would be stored in Nuance’s cloud. In this scenario, any car a driver gets in could recognize their voice, and make available personalized destinations, contacts, calendar appointments and music preferences.

To show off how well its voice recognition biometrics worked, the test vehicle included a quiz game. Rather than pushing a buzzer, whichever of the four participants in the car said “Got it” first got to answer the question. Nuance’s system recognized which player spoke first, and only accepted answers from that person, even while others in the car talked.

This video from Nuance shows a concept scenario for its voice recognition technology.

A change is going to come

No production car includes Nuance’s latest Dragon Drive voice command system yet, and Montague would not name any manufacturers who will implement it, as automakers like to make these sorts of announcements themselves. However, the company already supplies voice command systems to most major automakers. As automakers add dedicated data connections to their cars, it won’t be a big step to add Nuance’s hybrid embedded-cloud voice system.

The newest BMW 7 Series, which launched in 2015, uses this hybrid model for its voice system, and can process “800 intents”, as Montague calls Nuance’s natural language voice commands. BMW will likely upgrade its other models with more advanced versions of Nuance’s system as they come out.

Given the development cycles for cars, it will probably be three to five years before the technology Nuance demonstrated this week finds its way into production. But when it does, drivers should find a helpful virtual assistant that eliminates the need for the kind of button-pushing we all do in current cars.
