
Delphi demonstration vehicle fitted with the company's latest automated-driving technology under test in Pittsburgh, home to the company's Ottomatika driving-software engineering unit.

Delphi showcasing next-gen autonomous system at CES 2017

Delphi will use CES 2017 to demonstrate an advanced new automated-driving technology platform the company plans to offer automakers by 2019 as a complete system enabling vehicles to operate with SAE Level 4-5 fully autonomous capability.

The Delphi technology platform, called Centralized Sensing Localization and Planning (CSLP), leverages the latest supercomputing microprocessor chip from Intel, as well as a new sensor-fusion processor and tri-focal camera hardware from Israel-based machine-vision specialist Mobileye.

But apart from the sheer processing power Delphi has assembled for the CSLP system, perhaps its most distinctive advance is new Mobileye-developed software called Road Experience Management (REM). REM provides the vehicle with crowd-sourced information to create an ultra-precise, real-time map that Mobileye said “is a prerequisite for safe autonomous driving.”

The Las Vegas demonstration during CES 2017 is on a 6.3-mi (10.1-km) course composed of public roads that Delphi claimed is “the most complex automated drive ever publicly demonstrated on an urban and highway combined route.”

At a media background session prior to the CES unveiling of the CSLP system, Glen De Vos, vice-president of services for Delphi, enthused that the CES demonstration is “the first time we will showcase this (combined Delphi, Mobileye and Intel technology) together. We couldn’t be more excited.

“Three factors will separate the leader from the pack in the race to offer driverless vehicles by 2019,” De Vos continued in a release: “best-in-class perception sensors such as cameras, radar and LiDAR, automotive experience and computer processing speed.”

De Vos told the media that, because of its advanced vision sensing and software, the system will be less expensive than rival systems under development that rely on still-costly LiDAR sensors to generate adequate data about the environment around the vehicle.

He said Delphi projects its turnkey system will cost on the order of $5,000, a figure expected to fall substantially as component costs decline and sales volumes increase. He also said Delphi currently has no “committed customers” for the system.

Assembling the players and technologies

For autonomous-driving development, Delphi will continue its established role as a Tier 1 integrator of technology in order to sell a complete system to automakers, many of which either lack the resources to develop their own automated-driving systems or are not inclined to do so, preferring that suppliers make the investment—and, potentially, shoulder the early-adoption liabilities.

In August 2016, Delphi announced its partnership with Mobileye to develop the sensor-fusion aspects required for high-level automated driving. A few months later, the company confirmed it had enlisted Intel to supply the advanced processing chipset that will enable trillions of calculations per second.

In 2015, Delphi acquired Ottomatika, an automated-driving software engineering specialist spun off from research at Carnegie Mellon University. Ottomatika’s driving-software algorithms helped Delphi achieve a nearly fully autonomous cross-country vehicle trip that same year.

Not long before its partnership with Delphi, Mobileye, which has contracts with 27 automakers to supply some type of advanced driver-assistance technology, was on the front lines of autonomous-driving development’s most notable setback to date: a Tesla using Mobileye camera vision and software crashed while operating under the semi-automated Autopilot system, killing the driver.

It is Mobileye’s latest vision-sensing hardware and software, however, that is at the center of the Delphi CSLP system’s sensor-fusion and software capabilities—particularly the REM software “overlay” on its EyeQ 4/5 System-on-a-Chip microprocessor. This enables the camera-vision capabilities alone to position the vehicle with 10-cm (3.9-in) accuracy—even in the absence of a Global Positioning System (GPS) signal.
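The principle behind landmark-based positioning without GPS can be illustrated with a toy calculation. The sketch below is purely hypothetical—Mobileye’s REM algorithms are proprietary and far more sophisticated—but it shows the basic idea: given a sparse map of known landmark coordinates and camera-measured ranges to those landmarks, a vehicle’s position can be recovered by iterative least squares. All names and values here are invented for illustration.

```python
# Hypothetical sketch of landmark-based localization (not Mobileye's REM).
# Given known 2D landmark positions and measured ranges to each, estimate
# the vehicle position via Gauss-Newton iteration.
import numpy as np

def localize(landmarks, ranges, guess=(0.0, 0.0), iters=20):
    """Estimate a 2D position from ranges to known landmarks."""
    x = np.asarray(guess, dtype=float)
    landmarks = np.asarray(landmarks, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    for _ in range(iters):
        diffs = x - landmarks                  # vectors from landmarks to estimate
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges at current estimate
        J = diffs / dists[:, None]             # Jacobian of range w.r.t. position
        r = dists - ranges                     # residuals (predicted - measured)
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        x -= step                              # Gauss-Newton update
    return x

# Three mapped landmarks (e.g., sign posts) and measured ranges to each:
marks = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
true_pos = np.array([30.0, 40.0])
meas = [float(np.linalg.norm(true_pos - np.array(m))) for m in marks]
print(localize(marks, meas, guess=(10.0, 10.0)))  # converges near (30, 40)
```

In practice a production system would fuse many such landmark observations with odometry over time, which is one reason a sparse map of fixed features can substitute for continuous GPS.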

Dan Galves, Mobileye senior vice-president and chief communications officer, said of REM, “This really is what Mobileye is offering to the industry.” He added that in internal development, a test vehicle was able to drive autonomously using only camera vision after just four circuits of a congested portion of I-75 near Detroit supplied the necessary visual data for REM.

The REM software is “really in the validation phase right now,” said Galves, who added that Mobileye also has supplied the REM software to EyeQ-based vision systems being used by GM, Nissan and Volkswagen.

Next-gen accuracy

Real-time understanding of the “local” environment with REM is “the key element of automated driving” using Delphi’s new autonomous-technology platform, said De Vos. He explained that the company’s recent automated mobility on demand (AMOD) pilot program in Singapore will adopt the REM-based system. Delphi also plans AMOD demonstration programs in North America and Europe (in the U.S., likely Pittsburgh and Boston).

Ironically, despite the high-powered processing capacity Delphi is building into its CSLP platform with the help of Mobileye and Intel, the REM software itself does not require much memory. The roadside signs, buildings and countless other fixed landmarks nationwide that the system uses for a large portion of its localization “knowledge” occupy only about 60 GB of data storage.

But at least for now, advanced camera vision and REM are not replacing the highly accurate onboard mapping capability available from LiDAR sensing. Delphi’s De Vos said the CES demonstration vehicles will have six electromechanical LiDAR sensors, not to mention radar, to augment vision sensing.

Delphi sees AMOD projects as the likely first candidates for high-level (SAE Level 4-5) autonomous driving. De Vos noted that autonomous buses, mobility “pods” and individual ride-share passenger cars probably will be the best initial deployments for high-level autonomy.
