Ask a vehicle chief engineer or program manager to name the areas most critical to delighting the end customer, and the top responses will likely include "center stack" and the human-machine interface (HMI). Consumers expect more technology, features, and functionality in the latest infotainment systems, which OEMs and their suppliers must deliver within ever-shorter design cycles.
The key to superb execution here is thorough up-front planning and clear communication with the user interface (UI) development team. Two experts in the field, Steve Tengler and Peter Abowd of Altia, a company that specializes in HMI tool development, have created the following scenario aimed at demystifying the process and educating AEI readers. Tengler and Abowd manage a global team of engineers that assists automotive OEMs and Tier 1 suppliers in developing "user experiences" using Altia's software tools and processes.
In the scenario, Tengler and Abowd meet with an automotive executive in charge of infotainment planning to discuss the jurisdiction, or focus, of a user experience (UX) team, as Altia calls it. The exec needs a visual to show her upper management where they should focus in the future, and she wants to talk it through before sitting down to e-paint.
Exec: It’s mostly designing the screens in the vehicle, right? As in "key on" and what appears, yes?
Tengler: Not by a long shot. That would be a portion of the user interface, but the automotive experience now spans far beyond the vehicle, both while driving and while parked.
Tengler: The experience is everything having to do with that car and, by some definitions, could span from the dealership introduction all the way to the third owner's browsing of the user's manual. But let's start with the minimum: there are multiple interfaces that are now connected to the vehicle.
Exec: Such as the phone.
Abowd: Exactly. That phone app should, in theory, be designed in concert with the user interface of the vehicle to give a cohesive experience, as well as with the website where customers download destinations or change user settings. All of those now interact with the user and the vehicle's middleware even when the customer is thousands of miles from the vehicle, but they are still part of the experience.
Exec: So the definition should include those screens as well then. I get it.
Tengler: Screens are certainly the most pervasive these days, but don’t forget it’s about more than screens. You’ve got other modalities, too.
Exec: Modalities? So any output to the user?
Tengler: Outputs and inputs. Any interaction with the user should assist the target customer (the persona) toward meeting his or her goals. That's the basis of user-centered design (UCD). Voice recognition, touch pads, gesture, you name it. Some are even dabbling with olfactory and other sensory engagements.
Exec: Okay. So my picture should show the screens and any other physical devices with which the customer interacts for either inputs or outputs, and the UX team's jurisdiction should be the definition of how these interplay to meet the customer's goals. Correct?
Tengler: You're close. Before I forget, modeling that experience is key so the UX team can test it with customers and iterate to improve the design toward the targeted persona. That's how you make sure you're meeting the goals. But let me expand your picture slightly. In some cases, the user is also interacting with a live adviser who has a user interface of his or her own: the adviser's panel.
The usability of that panel can affect the user experience delivered to the automotive customer and therefore needs to be tested in conjunction with the rest of the experience. Finding a point-of-interest (POI) quickly or delivering emergency services can make a significant impression on a vehicle's UX.
Exec: Yikes! So we’re talking the interfaces of interfaces?
Tengler: Well, yes. If you want to lessen the definition, you might lessen the experience. All of this needs to be modeled and tested in conjunction. When the adviser finds a POI, how does that get delivered, accepted, and handed off smoothly to the customer? That's what he cares about, and therefore that's what the UX team needs to care about.
Exec: And what about the operating system or OS?
Abowd: What about it?
Exec: How does that play into all of this?
Abowd: If the operating system is doing its job properly, no one knows it is there, especially the users of any application running on top of the OS. The only reason the OS enters the equation at all is the explosion of "apps" that invaded the mobile handset market and have now entered the automobile. It is, in fact, the OS that enables a product to load and run new applications after a consumer has purchased the product.
Exec: Yeah, but doesn’t my choice of operating system affect the user experience?
Abowd: Well, that is where the confusion sets in. The operating system, which allows the new apps to be loaded after the product has been released, now enables any developer to create something that runs on the product. This new app comes with its own user interface and likely one that was not created for the automotive environment.
But this new app's user interface is not a function of the operating system; it is simply the result of the developer's own work. It is also entirely distinct from the UI of the product on which it is running.
For example, an iPhone has its iOS, and on that iOS runs the iPhone base UI, which presents things like the many pages of icons, the four quick-select icons at the bottom of the screen, and the running-app list when you double-tap the button. That comprises the lion's share of the iPhone's UI. iOS and the App Store allow the user to download new apps and add them to the icon pages.
When a user taps the new app's icon, that app runs and presents its own, completely independently developed UI. After a while the iPhone is filled with a wide diversity of apps with widely diverse UIs, even though they all use the same basic input and output modality: the touchscreen.
Abowd: It has really complicated managing the user experience in the vehicle now that the doors are open for anyone to create an app for the car. The situation is clouded further when product developers confuse the product OS, the product-native UI, and the individual apps' user interfaces.
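[The separation Abowd describes can be sketched in a few lines of code. This is a hypothetical illustration, not any real automotive or mobile API: the "OS" layer only installs and launches apps and draws nothing itself, while each app ships a UI of its own, independent of the product's native UI.]

```python
class App:
    """An installable app: the OS knows nothing about its UI."""
    def __init__(self, name, render):
        self.name = name
        self.render = render  # the app's own, independently developed UI

class ProductOS:
    """The 'invisible' layer: it loads and launches apps, nothing more."""
    def __init__(self):
        self.apps = {}

    def install(self, app):
        # e.g. the result of an app-store download after purchase
        self.apps[app.name] = app

    def launch(self, name):
        # The OS hands control to the app; the screen the user sees
        # is entirely the app developer's work, not the OS's.
        return self.apps[name].render()

os_ = ProductOS()
os_.install(App("nav", lambda: "turn-by-turn map UI"))
os_.install(App("music", lambda: "album-art player UI"))

# Same OS, same touchscreen -- two completely different UIs.
print(os_.launch("nav"))
print(os_.launch("music"))
```

[The point of the sketch: nothing in `ProductOS` constrains what an app renders, which is exactly why independently developed apps arrive with UIs never designed for the automotive environment.]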
Tengler: And right now the definition includes only a single vehicle. Imagine an interconnected network of vehicles and infrastructure. Each car will have new apps with dynamically updating human-machine interfaces, plus new, urgent, safety-related messages arriving from a multitude of other cars. That will eventually take the user experience to a whole new level.
Exec: This is getting complex! Could you throw together that picture for me?