When Carnegie Mellon University fielded the Boss autonomous vehicle in 2008, the SUV's entire cargo area was crammed with the requisite computing hardware. (Sam Abuelsamid)

The Navigator: Explosion in compute power enables automated driving

Over the course of this summer, as I continue to follow all the companies developing automated driving systems, I’ve also been reading a preview copy of the new book from Larry Burns. Penned by the former General Motors R&D chief with co-author Christopher Shulgan, Autonomy: The Quest to Build the Driverless Car is both a personal memoir and a chronicle of the birth of the modern automated vehicle. It’s been more than a decade since I took my first ride in an automated vehicle, the Chevrolet Tahoe dubbed Boss that was built by the Tartan Racing team from Carnegie Mellon University.
 
That full-size SUV was bristling with sensors, including a first-generation Velodyne lidar on the roof. While it successfully navigated the test course at the 2007 DARPA Urban Challenge, as well as the course in the parking lot at the 2008 Consumer Electronics Show where I first encountered it, Boss didn’t have a lot of practical use in that form. It was a proof of concept that a vehicle could autonomously navigate a relatively complex environment without a human operator, but it was so packed with equipment that it could barely handle a couple of passengers.
 
In particular, the entire cargo area was consumed by a server rack. Today, Bryan Salesky is CEO of Argo AI, the company developing the automated driving system for Ford’s 2021 production AV. In 2007, Salesky led the software development effort for the CMU team. The compute platform Salesky’s team used was powered by 10 blade computers with Intel Core 2 Duo processors. Because of the custom setup the team used, it’s hard to say for sure what the total computing capability of that system was, but published Whetstone benchmark tests peg the Core 2 Duo at about 180 million instructions per second. Theoretically, a rack of 10 would crunch through 1.8 billion instructions every second.
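For a rough sense of what that adds up to, here is a back-of-the-envelope sketch using the approximate figures above; it is illustrative only, not a measured benchmark of Boss’s actual hardware:

```python
# Back-of-the-envelope estimate of the Boss compute rack's aggregate throughput,
# using the rough figures quoted above (not measured values).

BLADE_COUNT = 10                  # blade computers in the cargo-area rack
PER_CPU_INSTR_PER_SEC = 180e6     # ~180 million instructions/sec per Core 2 Duo (approximate)

rack_instr_per_sec = BLADE_COUNT * PER_CPU_INSTR_PER_SEC
print(f"Boss rack: ~{rack_instr_per_sec / 1e9:.1f} billion instructions per second")
# -> Boss rack: ~1.8 billion instructions per second
```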
 
In the intervening years, there has been a marked shift in how high-performance computing is handled, with computer scientists increasingly using highly parallel processors originally designed for video game graphics to run deep neural networks. Over the past five years, Nvidia has become the dominant player in the space with its line of GPU-based Drive PX systems, which are relatively affordable and come with extensive software development toolkits.
 
As the engineers developing automated driving work to commercialize their systems, the software must become increasingly capable of handling situations Boss was never meant to deal with, such as running in snow, fog and the dense urban traffic of cities like San Francisco and Singapore. Nvidia’s Drive Xavier system on a chip is a single piece of silicon that integrates multiple processor architectures to achieve the redundancy that driverless vehicles will need for safe real-world operation. On a board roughly the size of two smartphones side by side, Xavier can churn through 30 trillion operations per second (TOPS), several orders of magnitude more than the system that powered Boss.
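To put “several orders of magnitude” in perspective, here is an illustrative comparison of Xavier’s quoted 30 TOPS against the roughly 1.8 billion instructions per second estimated for Boss’s rack above; note that it mixes general-purpose instruction rates with deep-learning operation counts, so treat it as a ballpark only:

```python
# Rough sanity check on the "several orders of magnitude" claim.
# Compares Nvidia's quoted 30 TOPS for Drive Xavier against the ~1.8 billion
# instructions/sec estimated for Boss's blade rack (different kinds of
# operations, so this is only an illustrative ratio).

import math

boss_ops_per_sec = 1.8e9       # Boss rack estimate (see above)
xavier_ops_per_sec = 30e12     # Drive Xavier, 30 trillion operations per second

ratio = xavier_ops_per_sec / boss_ops_per_sec
print(f"Xavier vs. Boss: ~{ratio:,.0f}x, roughly {math.log10(ratio):.1f} orders of magnitude")
# -> Xavier vs. Boss: ~16,667x, roughly 4.2 orders of magnitude
```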
 
Intel is still playing in the automated driving space, having acquired Mobileye in 2017. Mobileye’s new EyeQ5 chips are expected to begin shipping in sample quantities to automakers in 2019, and Intel plans to use a pair of EyeQ5 chips along with a custom Atom CPU to achieve 48 TOPS.
 
Increasingly, however, engineers are realizing that achieving Burns’ goal of vehicles that can operate autonomously anywhere, anytime, under all conditions will require even more computing power. That’s why Nvidia is now shipping samples of Drive Pegasus to companies like Argo for testing. Pegasus combines two Xavier chips and two next-generation GPUs on a board about the size of a modern laptop and is claimed to hit 320 TOPS.
 
From millions to billions to trillions of operations, all of this performance will be critical to achieving the levels of functional safety and fail-operational capability that will be expected of widely deployed automated vehicles. When a human is no longer expected to supervise and take over when the technology fails, cars will need backup virtual drivers that can bring passengers to a safe place when something inevitably goes wrong.