A Blog by Jonathan Low

 

Feb 1, 2019

In-Car AI Assistants Are Coming - Whether You Want Them Or Not

It will be intrusive and presumptuous, purposely programmed to get users (because you pay for, but don't really own, these systems in the traditional sense) to buy more of whatever the company that supplied the technology wants you to buy. JL

Jonathan Gitlin reports in Ars Technica:

EVs will use facial recognition as biometric authentication, pulling down your profile from the cloud. But "in the process, it completely adapts to your needs, welcoming you as it adjusts seating, temperature, as well as screen preferences." And you can bet that all of the eye-tracking, emotion-detecting driver monitoring systems will feed that info to whichever onboard AI is running alongside it. In addition to alerts given when it thinks you're sleepy or distracted, you might get asked if you want calming music if the system thinks you're ready for road rage. AI systems are another way to get us to buy things, only this time while we're on the move.
Like it or not, CES has now become a car show, for the same reason we cover the automotive world now at Ars Technica. Simply put, the tech sector has taken a look at the automobile, and it sees dollar signs. Whether or not this vast annual trade show is the right way to kick off a new year (spoiler—it's not), attending CES does have some value in trend-spotting. And this year, the main trend appeared to be "the same product you saw last year, but with AI": AI-enabled TVs, AI-enabled induction cooktops, and yes, AI in cars.
Truth be told, the idea of an in-car AI personal assistant has been around for a while now. I got my first glimpse of this brave new world in 2016 when Audi showed me its concept called PIA (for Personal Intelligent Assistant). Since then, I've heard talk of such AI helpers from more and more car makers, and the technology is getting closer to production. Take BMW—in a couple of months, in some markets, you'll be able to buy a 3 Series (or 8 Series, or X5, or Z4) that includes the company's new Intelligent Personal Assistant as a feature of the new seventh-generation infotainment system. Leveraging some rather good voice recognition (take a bow, Nuance), you can give the car instructions like, "Hey BMW, I'm cold," at which point it will increase the cabin temperature for you. "We expect much higher engagement with voice interaction," said Dieter May, BMW's SVP for digital products and services.
May is far from alone in his intuition. Audi, Chrysler, Daimler, Ford, and Hyundai all also use Nuance's Dragon Drive platform to enable us to speak to our cars (and have them understand us). And it's not the only game in town; both Amazon and Google are also making inroads into the automobile, often with the same automakers. For example, you can already buy BMWs and Toyotas with Alexa integration, and Volvo is adding Google Assistant to its infotainment system. For the smart home user, this means you can do things like ask if your garage door is still open or if you forgot to turn the lights off when you left the house.
Looking a little further out, in-car AI promises to do a lot more. Byton, a new electric vehicle startup that should launch its first vehicle in 2020, promises its EVs will use facial recognition as biometric authentication, pulling down your Byton profile from the cloud. But "in the process, it completely adapts to your needs, personally welcoming you as it automatically adjusts seating, temperature, as well as screen preferences."
And you can bet that all of the eye-tracking, emotion-detecting driver monitoring systems we've seen demos of will feed that info to whichever onboard AI is running alongside it. In addition to alerts given when it thinks you're sleepy or distracted, you might get asked if you want to hear some calming music if the system thinks you're getting ready for some road rage. (This probably means your future car will know when you're swearing at it, too.)

Who’s asking for this?

What's to blame for the techification (that's totally a word, honest) of cars? Perhaps it's the Tesla effect. When the Model S arrived with a big touchscreen and over-the-air updates, people started to take notice, and now every OEM will tell you that customers are crying out for a more smartphone-like experience from their vehicles. That's why high-resolution displays are replacing analogue gauges, the old-school (and insecure) CAN bus is being joined by automotive Ethernet, and powerful GPUs are making their way from gaming rigs into the domain controllers that will consolidate the dozens of discrete black boxes that currently run each of a car's many digitally tunable attributes.
In each case, cars represent a lucrative new market for the consumer tech industry. For instance, a company like Aquantia does well for itself selling Ethernet solutions to data centers and places like Apple. But those traditional business lines could be dwarfed if it's the one supplying multi-gigabit Ethernet systems for millions of new cars each year. And the same applies to companies like Nvidia, Qualcomm, or Intel. All this new fancy hardware needs a reason to be there, after all.
Some of the applications actually sound quite useful. A car that knows when it's about to need servicing sounds pretty helpful, for example. As does being able to easily search for directions. And I've used BMW's current voice recognition system to query the user manual more than once while testing some of its cars. According to Dirk Wollschläger, IBM's general manager for global automotive industry, AI will ensure that, for example, your car doesn't ask you to apply a critical update while you're in the middle of your daily commute.
"For this kind of use case, we’re also using AI. We only want to ship relevant information to the driver, and the end user should be able to define when they want that information. You shouldn’t get an update request if you’re busy driving, but if you’re going to park, it knows you're now parked and can then give you a prompt or whatever. We're working on that with OEMs," Wollschläger told Ars.
But all too often, these AI systems are presented as yet another way to get us to buy things, only this time while we're on the move. And knowing the audience, that's the last thing any of us really want. But maybe I'm being too cynical. Upton Bowden, Visteon’s director of advanced technology development, thinks perhaps I am. "In our research, we've found regional differences on AI assistants. The US is jaded by memories of Microsoft's Paperclip, I think. Whereas in Asia they look at it differently and are keen to have that feature. And as the AI assistant gets better, it will drive utilization," he told me.
Update: The Alexa team got in touch with some statistics to show that in fact, plenty of people are asking for these features, even if few of them are represented in the comments here at Ars: "The Alexa Auto team just wrapped a study with JD Power that surveyed consumers who already own a voice-enabled speaker like the Amazon Echo, and found that 76% of the people who interact with a voice assistant at home say they want the same assistant in their car. Combine that with the fact that customers have already purchased more than 100 million Alexa-enabled devices, and you start to get a picture of what the demand is like."