A Blog by Jonathan Low


Nov 27, 2019

Is It Time To Ditch In-Car Touchscreens For Voice-Activated Controls?

Complication means distraction. And distraction means accidents.

As screens have become bigger, the impetus has not been to make them simpler, but to fill all that space with information that can then be collected and resold. Voice-activated controls based on familiar systems like Siri or Alexa could reduce the potential for trouble. But in the ongoing battle for consumers' attention, self-driving may offer the only safe alternative to the relentless marketing that connectivity has unleashed. JL


Jonathan Gitlin reports in ars technica:

The high-definition, multicolored glitz of the consumer electronics world has proliferated throughout the (auto) industry, replacing dials and buttons with touchscreens. Whether that's a good thing is up for debate. Screens grew and became more capable, so there was more need to interact with them. "If you use a touchscreen in a car that is complicated, it's distracting. A voice interface in a car that is custom and you have to learn to use it will feel bad. But if you're used to interacting with Siri, or Amazon Alexa, or Google Assistant, then you could bring over that experience into the car."
The primary controls for operating a car are the same today as they were a hundred years ago. We push one foot pedal to speed up, another to slow down, and turn a hand-operated wheel to steer. Over the years, people have suggested joysticks or other radical replacements for these controls, but none has proven superior to wheels and pedals. However, when it comes to our other interactions with automobiles, the past decade or so has seen quite the change within our car interiors. The high-definition, multicolored glitz of the consumer electronics world has proliferated throughout the industry, replacing dials and buttons with touchscreens. Whether that's an entirely good thing is up for debate.
It might all be infotainment's fault. In the old days, there were just car stereos. You turned a knob or pushed a button to listen to the radio, inserted some kind of physical media, and if you were really fancy, maybe there were some sliders to change the EQ settings. Soon, small digital screens were appearing in our cars' center consoles, built-in alternatives to the suction-cupped GPS units that all of a sudden rendered the road atlas a thing of the past. Those screens grew and became more capable, so there was more need to interact with them. Dedicated physical buttons have given way to jogwheels, scroll- and touchpads, and then the touchscreen.
One problem with all of these additions is that they can be a distraction from driving. Taking your eyes off the road is bad, and touchscreen interfaces are generally not conducive to developing "eyes-off" muscle memory, particularly if they lack haptic feedback. It's not that touch interfaces are inherently bad, but they do let designers get away with shipping poor user interfaces.
"The problem is that the touchscreen gives people a lot more flexibility in how they would create an interface, which can really quickly lead to complexity," said Mark Webster, director of product at Adobe and an expert on the use of voice in UI and UX design. "What was so fascinating to me about the Navy decision [to replace touchscreen bridge controls following two collisions] was that, I am sure if you look at that interface, it's very complicated. So it really probably isn't the touchscreen. That is in and of itself the problem," he said.
"If you use a touchscreen in a car that is complicated, it's distracting and not a good experience. But something like Apple CarPlay, or Android Auto, that is bringing in an interface that you're really familiar with, that feels natural, intuitive, that you're used to dealing with on your phone all the time. That's actually a place where I think the design of that interface in a touchscreen works really well for that," Webster said.

What happens when there are no more drivers?

A steering wheel and pedals might have gotten us this far, but that may not hold true for the coming decades, as autonomous vehicles begin working as robotaxis in specific geofenced markets. And that's prompting people to rethink how we—as self-loading cargo—interact with those cars. "The driver has always been the center of the car. Once the cars will not need to be driven, then of course, that takes away a big part of what needs to be operated in the vehicle," explained Gil Dotan, CEO of Guardian Optical Technologies.
"The next challenge would be to how you operate a vehicle when you're not necessarily sitting in the front. The vision for us is to enable passenger-aware vehicles—basically create some kind of awareness between the car and the passengers. And that you can only do once you have a good understanding of who's in the car and what is the context of what the people are doing. And then you can create an interface which is much, much more proactive, which is much more suggestive," Dotan told me.
If the idea of your car knowing your emotional state seems highly dystopian, it will be of little comfort to know that demo systems that can do exactly that were on display at CES in January.

BMW's 2018 iNEXT concept car had one of the most interesting interiors I've seen of late. I'm not referring to the carpeted seats, nor the screens and center console that look plucked from a trendy hotel—although I am a fan of those aspects, too. Specifically, BMW thought of new ways for back-seat passengers to interact with the car. A projector could beam displays onto different surfaces, and touch sensors underneath the seat fabric let you trace commands and gestures as inputs.
BMW has been an early advocate of gestures, and you can use them to control the volume or take phone calls in some of its newer cars. "I think gestures are an important part of good interface design. I think there's always a learning curve, to get users to understand what is possible—three-finger taps, two-finger taps, pinching and zooming. That's all stuff we needed to learn to interact with a touchscreen on a phone. So gestures will bring along with them a whole other set of UI/UX conventions that we as the design industry both need to establish but also communicate to the user," Webster told me.
Gil Dotan sees the back seat as the perfect place to implement a gesture-based UI. "I'm still surprised when people are talking about it for the passengers in the front. Where, you know, the cockpit is actually designed for the passengers in the front, usually mainly for the driver. But it's quite easy to get to the controls from the passenger side. I think gesture recognition is actually quite important when you're looking at the passengers in the rear. That's where you really have a deficit of interface tools, and understanding what they need and how to convey messages—that's something that can be done. And this is where the focus needs to be put," Dotan explained.
Like Webster, he also recognized the potential challenges of designing an intuitive, gesture-based UI. "For basic things like no and yes, it's quite easy to understand. The more you go into subtleties, the more it becomes challenging, and it has variations between cultures and geographic regions," Dotan said.

Just talk to me

Voice commands will probably find widespread adoption well before touch-sensitive seat cushions interpret our finger doodles as an instruction to change the cabin temperature or play a different track. In-car voice control was pretty ropey until quite recently, but it has come on in leaps and bounds as speech recognition and natural language processing algorithms have improved. Like the onslaught of touchscreens, this too has come from the tech sector, as voice assistants have become commonplace in our phones and smart speakers.
"I think a voice interface in a car that is weird and unique and custom and you have to learn all this stuff in order to use it will feel bad. But if you're used to interacting with Siri, or Amazon Alexa, or Google Assistant in other places, then you could bring over all that learning experience and that consistency into the car," Webster told me.
Just how these AI assistants are integrated into the car is something that's still being worked out. "Am I talking to the car? Am I talking to a voice assistant that is representing the car? Is that assistant representing me to the car?" asked Webster. "We are intentionally using this skeuomorphic approach and the mental model of a conversation with a person in order to advance voice. But there are lots of places where that is not the right model. If I talk to the same assistant that I talk to in my house, but in my car, am I going to think of it in the confines of a smart home where it can open and close windows and turn off lights and interact with music? Can I ask about the car itself? Can I ask about tire pressure? Is it aware of its environment in a different way than it is in the home, or is it not? So the design decisions that we make there will anchor in users' heads," he said.
