Android Auto, a new voice-enabled in-car platform that lets a car’s dashboard run navigation, communication, and music applications residing on an Android smartphone, was one of the main announcements at the recent Google I/O 2014 developers’ conference in San Francisco. The connected-car system, known as Google Auto Link during development, is the first product from the Open Automotive Alliance, a collaboration of 40 industry partners launched earlier this year that includes Audi, General Motors, Honda, Hyundai, and Volvo.
Android Auto resembles Apple’s CarPlay, which brings the look of Apple’s iOS mobile operating system as well as some of the iPhone’s apps to the car’s center display. CarPlay, which was unveiled in March at the 2014 Geneva Auto Show, is to appear in Honda, Jaguar, Mercedes-Benz, and Volvo models later this year.
The fact that big players such as Google and Apple have joined the likes of Abalta Technologies, Airbiquity, Bosch, Harman, Luxoft, Nuance Communications, OpenCar, RealVNC, and UIE in developing technology that integrates smartphones into cars highlights the inexorable computerization of the automobile.
This trend toward embedded connected-car systems places greater focus on the powerful IC chipsets that will be needed to run increasingly sophisticated digital instrument clusters and infotainment features, as well as the advanced driver-assistance systems (ADAS) that are now opening the way toward self-driving cars. ADAS functions include speed-limit-sign recognition, blind-spot monitoring, lane-departure detection, pedestrian detection, collision avoidance, assisted parking, and enhanced night vision.
Nvidia inside
A key member of the Open Automotive Alliance is Santa Clara-based chipmaker Nvidia Corp., whose system-on-chip (SoC) modules today power infotainment, instrument cluster, and rear-seat entertainment systems in more than 35 car models made by a half-dozen foreign and domestic car makers, according to Daniel Shapiro, senior director for automotive at Nvidia. Shapiro spoke at the recent Connected Car Conference during CE Week in New York.
About 5.1 million cars on the road today have Nvidia chips in them, Shapiro reported, but as yet none use them for safety and driver-assistance functions. In the next few years, more than 25 million additional cars from OEMs such as Audi, BMW, Porsche, Tesla, and Volkswagen will have Nvidia inside, he predicted, although even then only a small proportion of those vehicles will employ the SoC modules for first-generation autonomous-driving functions. European models that incorporate Nvidia chipsets are likely to be unveiled at the Paris Auto Show this fall.
At the International Consumer Electronics Show in Las Vegas last January, Audi demonstrated an A7 sedan with an Nvidia Tegra K1 mobile processor in the trunk running semi-autonomous piloted-driving functions. Audi’s Traffic Jam Assist system, for example, can take over driving duties in heavy highway traffic moving slower than 37 mph (60 km/h). The car maker hopes to introduce such technology in its products by 2017.
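At its core, the engagement logic for such a system amounts to checking a set of preconditions before handing control to the computer. Here is a minimal sketch; only the 60 km/h (37 mph) ceiling comes from Audi’s description, and the other preconditions, names, and thresholds are illustrative assumptions:

```
#include <iostream>

// Hypothetical engagement check for a traffic-jam assistant.
// Only the 60 km/h ceiling is from Audi's description; the other
// preconditions are assumptions made for illustration.
struct VehicleState {
    double speedKmh;            // current vehicle speed
    bool   onDividedHighway;    // road type, e.g. from the nav system
    bool   laneMarkingsVisible; // camera confirms lane boundaries
};

bool canEngageTrafficJamAssist(const VehicleState& s) {
    constexpr double kMaxEngageSpeedKmh = 60.0; // ~37 mph
    return s.speedKmh <= kMaxEngageSpeedKmh
        && s.onDividedHighway
        && s.laneMarkingsVisible;
}

int main() {
    VehicleState state{45.0, true, true};
    std::cout << (canEngageTrafficJamAssist(state)
                      ? "Piloted driving available\n"
                      : "Driver retains control\n");
}
```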
One chip to rule them all
The Tegra K1 visual computing module, a processor that Nvidia describes as having a “supercomputer architecture,” is likely to be in Google’s Android Auto/Auto Link system. The K1 chipset includes a quad-core ARM CPU, a Kepler-class graphics processing unit with 192 of the company’s CUDA cores, and dedicated audio, video, and image processors, Shapiro explained.
“You might ask what a company that makes graphics and gaming processors is doing providing chips for automotive safety functions,” he said. “Well, it turns out that visual computing power is key not only for photorealistic digital dashboard displays and in-vehicle infotainment systems, but also for building the 3D maps that allow sensor systems to recognize what’s out there for driver-assistance safety functions and, eventually, autonomous driving capabilities.”
Based on incoming sensor data, the low-power (“energy-sipping, only a couple of watts”) mobile processor enables the creation of 3D models in real time to track stationary and moving objects such as other cars, traffic lights, pedestrians, or even a stray soccer ball in the road ahead. It then identifies those objects and determines whether they might affect the next decision the car needs to make.
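Shapiro did not detail Nvidia’s pipeline, but the flow he outlines (build a model of the scene, identify the objects in it, then ask whether each one matters to the next driving decision) can be sketched as a simple loop. Everything below, from the object fields to the crude time-to-contact test, is an illustrative assumption rather than Nvidia code:

```
#include <string>
#include <vector>
#include <iostream>

// Illustrative scene object reconstructed from incoming sensor data.
struct TrackedObject {
    std::string label;     // e.g. "car", "pedestrian", "ball"
    double distanceM;      // range to object, in meters
    double closingSpeedMs; // positive if approaching us, in m/s
};

// Does this object affect the car's next decision? A crude
// time-to-contact test stands in for a real relevance classifier.
bool isRelevant(const TrackedObject& obj) {
    if (obj.closingSpeedMs <= 0.0) return false; // holding or receding
    double timeToContactS = obj.distanceM / obj.closingSpeedMs;
    return timeToContactS < 4.0;                 // assumed decision horizon
}

int main() {
    std::vector<TrackedObject> scene = {
        {"car", 80.0, 2.0},        // slow-closing vehicle far ahead
        {"pedestrian", 5.0, 1.5},  // person stepping into the road
        {"ball", 20.0, 10.0},      // stray soccer ball rolling out
    };
    for (const auto& obj : scene) {
        std::cout << obj.label << ": "
                  << (isRelevant(obj) ? "act" : "monitor") << "\n";
    }
}
```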
“In today’s cars, each sensor system has its own platform,” Shapiro said. “We’re soon going to see a consolidation, with several driver-assistance processors being replaced by one Tegra K1.”
Car computers, he continued, “will be more powerful than any computer you’ve ever owned.” In the relatively near future, connectivity, navigation, and safety functions will feed cars with gigabytes of data per second, a load that will require considerable computing power. “In time, the system will be layering radar, LIDAR (time-of-flight ranging), and camera data on top of each other—sensor fusion—to provide the redundancy that’s needed to boost reliability and eliminate false positives.”
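One common way to obtain the redundancy Shapiro describes is to require agreement between independent sensor channels before a detection is acted on, so that a single noisy sensor cannot trigger a false positive. The following two-out-of-three vote is a hedged sketch of that idea; the structure and field names are invented, not drawn from any Nvidia interface:

```
#include <iostream>

// Independent detections of the same candidate obstacle from three
// sensor modalities. Field names are illustrative.
struct FusedDetection {
    bool radarHit;  // radar return at this location
    bool lidarHit;  // LIDAR point cluster at this location
    bool cameraHit; // vision classifier fired here
};

// Two-out-of-three vote: one sensor's false alarm is outvoted,
// which is one simple way fusion suppresses false positives.
bool confirmedObstacle(const FusedDetection& d) {
    int votes = d.radarHit + d.lidarHit + d.cameraHit;
    return votes >= 2;
}

int main() {
    FusedDetection ghost{true, false, false}; // radar-only artifact
    FusedDetection real{true, true, true};    // all three agree
    std::cout << "ghost: " << (confirmedObstacle(ghost) ? "brake" : "ignore") << "\n"
              << "real:  " << (confirmedObstacle(real) ? "brake" : "ignore") << "\n";
}
```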
After noting that standardized, easily expanded, modular plug-in hardware is critical for progress in this area, Shapiro asserted that OEMs and Tier 1 suppliers will be able to differentiate their brands by developing their own distinctive HMI (human-machine interface) software in-house.
Software upgrades
Automotive computers, he added, “will run multiple applications, many of which might not even exist when you buy the car.” In the cockpit, for instance, gaze-tracking and other sensors may monitor the driver’s degree of attention to the task at hand and then determine how to respond.
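A driver-monitoring function of the kind Shapiro alludes to could be as simple as measuring how long the driver’s gaze has been off the road and escalating the response. The thresholds and response levels below are invented for illustration; they do not come from Nvidia or any production system:

```
#include <initializer_list>
#include <iostream>
#include <string>

// Escalating response to driver inattention, driven by a hypothetical
// gaze tracker reporting seconds since eyes were last on the road.
std::string attentionResponse(double eyesOffRoadS) {
    if (eyesOffRoadS < 2.0) return "none";         // normal glances
    if (eyesOffRoadS < 5.0) return "chime";        // gentle warning
    if (eyesOffRoadS < 8.0) return "haptic alert"; // seat or wheel buzz
    return "slow the vehicle";                     // assume incapacitation
}

int main() {
    for (double t : {1.0, 3.0, 9.0}) {
        std::cout << t << " s off road -> " << attentionResponse(t) << "\n";
    }
}
```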
In addition, “the software is going to get better during the life of the car,” the Nvidia executive said. “There’s no reason not to update the software algorithms much like you would with your phone and tablet. And your car is also going to get smarter by applying machine-learning techniques to train it,” expanding its capabilities not only during development and testing in the lab and on the track, but also after purchase when the system will gradually learn the owner’s likes and tendencies.
“The system will need to know what a red light is, for example,” he said. “And once it learns to identify them, it’ll have to learn priorities such as what to do if the traffic light turns green and a pedestrian is walking in front of the car.”
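That kind of priority can be expressed as an ordered rule set in which safety-critical detections override permissive ones. A minimal, hypothetical encoding of Shapiro’s example follows; the state fields and actions are assumptions for illustration:

```
#include <iostream>

// Hypothetical world state for Shapiro's example: the light has just
// turned green, but a pedestrian is still crossing in front of the car.
struct Intersection {
    bool lightIsGreen;
    bool pedestrianInPath;
};

// Rules are checked in priority order: a pedestrian in the car's path
// overrides a green light, which overrides the default of holding.
const char* nextAction(const Intersection& s) {
    if (s.pedestrianInPath) return "wait for pedestrian";
    if (s.lightIsGreen)     return "proceed";
    return "hold at stop line";
}

int main() {
    Intersection s{true, true}; // green light AND pedestrian crossing
    std::cout << nextAction(s) << "\n"; // prints "wait for pedestrian"
}
```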
Today, it’s no longer a question of if but when “no hands” driving will arrive. “It’ll be a slow integration process,” Shapiro predicted, depending on government regulations, customer acceptance, and so forth. “But I believe that growing public awareness of autonomous navigation technology may drive changes faster than we might think.”