What specific sensor types will comprise the advanced driver-assistance systems (ADAS) of the 2020s? That’s a controversial subject among engineers who are developing SAE Level 2 and 3 (and the so-called “L2+”) ADAS sensing suites for new vehicles. Many of them believe that visible-light cameras fused with radar will suffice to deliver the object-identification accuracy, redundancy—and cost effectiveness—that OEMs and the driving public expect of ADAS-equipped vehicles.
But a case is building for additional sensing capability, particularly for automatic emergency braking (AEB) and pedestrian-detection functions. Those safety-critical features currently rely on camera-radar inputs to “see” ahead. They enable the vehicle to react to a range of scenarios—from stalled traffic on a highway to humans and animals suddenly appearing in the road. (In Michigan alone, there were 53,464 traffic accidents involving deer in 2018, up 14% from 2016, according to state DOT data.)
Pedestrian fatalities in the U.S. are alarming: one dies every 88 minutes, on average, in traffic crashes. In 2018, 6,283 pedestrian lives were lost, up from 5,977 in 2017. That’s the most since 1990 and represents an increase of more than 35% since 2008. And three-quarters of all pedestrian fatalities occur after sunset, NHTSA reports. Detecting people and critters without fail in rain, snow, fog and darkness can be daunting for systems based solely on fused camera-and-radar sensing.
In October 2019, the American Automobile Assoc. tested several production AEB systems in various scenarios. During daylight tests, the test vehicle traveling at 20 mph (32 kph) struck the soft pedestrian target 60% of the time. In nighttime conditions, the vehicle under test traveling at 25 mph (40 kph) hit the target 100% of the time.
“If NHTSA truly requires 100-percent all-weather performance for pedestrian detection by 2021 to meet its 5-Star safety criteria, the industry will have to adjust its ADAS and AV strategies,” said Raz Peleg, sales director at AdaSky, an Israel-based developer of thermal-imaging systems. “In some places there is bad driving weather for half the year,” he noted. For those reasons, thermal-imaging technology is under consideration for both ADAS and SAE Level 4 self-driving AV applications.
A family of thermal sensors used in automotive applications, known as far infrared (FIR), uses a long-wavelength range outside the visible-light spectrum to “see” the relative intensities of heat (infrared) energy being emitted or reflected from an object—including buildings, parked vehicles and pavement. (The infrared spectrum consists of a near-infrared (NIR) section, with wavelengths of 0.75-1.0 μm; a short-wave infrared (SWIR) section, with wavelengths of 1.0-3.0 μm; a mid-infrared (MIR) section, with wavelengths of 3.0-5.0 μm; and the far-infrared (FIR) section, with wavelengths of 7.5-14.0 μm.)
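As a rough illustration of the band boundaries given above, a small helper can map a wavelength to its named band. This is a sketch of the figures in the text only, not an optics-standard definition (band edges vary by convention):

```python
# Classify an infrared wavelength (in micrometers) into the bands
# named in the text. Band edges follow the figures quoted above;
# this is illustrative, not a standards-body definition.
IR_BANDS = [
    ("NIR", 0.75, 1.0),    # near infrared
    ("SWIR", 1.0, 3.0),    # short-wave infrared
    ("MIR", 3.0, 5.0),     # mid infrared
    ("FIR", 7.5, 14.0),    # far infrared (automotive thermal sensors)
]

def ir_band(wavelength_um: float) -> str:
    """Return the IR band name for a wavelength, or a fallback string."""
    for name, lo, hi in IR_BANDS:
        if lo <= wavelength_um <= hi:
            return name
    return "outside listed bands"
```

A 10-μm wavelength, typical of the energy radiated by room-temperature objects, falls squarely in the FIR band the automotive sensors use.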
Anything that generates or contains heat can be detected and classified with thermal imaging. Humans and animals have unique heat signatures that can be detected only with a thermal camera. The hotter the object (such as a parked vehicle’s engine as it sits idling, or after it’s been shut off), the more it stands out against the background. Thermal sensors are particularly effective in less-than-ideal lighting and are outstanding in smoke and darkness.
By identifying emitted thermal radiation, FIR sensors can detect objects at distances well beyond those of conventional headlamps, says one OEM engineer. “Object detection for L2 through L5 must have 100 percent fidelity in all conditions,” he told Automotive Engineering in an email not for attribution. “No company can apologize for its cameras and radars not seeing a child stepping out from behind a parked vehicle in dense rain, for example. For that reason, we’re very interested in thermal imaging.”
In AdaSky’s testing, the company’s latest Viper sensor has recognized pedestrians at distances greater than 300 yards (275 m), or about twice the range of low-beam headlamps. The latest automotive-grade thermal sensors made by FLIR Systems offer similar object-recognition capability, claims technical project manager Kelsey Judd. Considered the mobility industry’s incumbent, Oregon-based FLIR has been selling thermal cameras into automotive—mostly as a driver-warning aid—since 2002 through a partnership with Veoneer (formerly Autoliv). FLIR has a production contract with an unnamed OEM through Veoneer, slated for 2021, on an SAE Level 4 autonomous production vehicle, according to the company.
Foresight, another Israeli company in this space, offers a binocular sensor array that fuses thermal images with those from the visual spectrum to produce a clear 3D view, the company says. Its stereoscopic vision technology uses two synchronized cameras to generate a depth map, enabling extremely precise object detection. Evolved from military sensor tech, FIRs use deep learning and machine-vision algorithms. They are a necessary complement to cameras and radar for ADAS as well as for the camera/radar/lidar “triad” for SAE Level 4 self-driving AVs, maintains Peleg, a former F-16 pilot for the Israeli Defense Force.
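Foresight’s stereoscopic approach rests on standard stereo geometry: for two synchronized cameras separated by a fixed baseline, an object’s depth is inversely proportional to its disparity between the two images. A minimal sketch of that relationship follows; the focal-length and baseline values are hypothetical illustrations, not Foresight’s specifications:

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d, where f is the focal
    length in pixels, B the camera baseline in meters, and d the
    disparity in pixels between the two synchronized images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000-pixel focal length, 0.3 m baseline.
# A 10-pixel disparity then corresponds to 30 m of depth.
z = depth_from_disparity(10.0, focal_px=1000.0, baseline_m=0.3)
```

Because depth falls off as 1/disparity, range precision degrades with distance, which is one reason stereo systems are fused with other sensors rather than used alone.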
“Our system can be fused with those other sensors. It’s an essential combination; our system can’t read street signs, for example, while cameras can,” he said. “But in low-visibility weather, and in corner cases such as with sun blinding and oncoming headlamps and exiting tunnels, thermal imaging does much better—and we do it uninterrupted, 24 hours a day.”
He reports that an Italian high-performance carmaker and a North American OEM “well versed in trucks” will put the technology into production by 2021. And both he and Judd draw distinctions between the thermal-imaging technology being readied for series production in automated and autonomous vehicle systems and that used by some OEMs as a night-vision aid. “Those [night vision] display their information on a head-up display for the driver to see,” Peleg said. “Our image, of higher detail quality, is not displayed to the driver. Instead, it’s shown to a software layer that interfaces with the vehicle’s subsystems. As with AEB, the driver doesn’t see anything; the vehicle only reacts.”
Demonstration drives of AV sensor systems are best conducted in bad weather, which provides the most realistic evaluation of system performance. Our drive in a Ford Fusion fitted with a roof-mounted AdaSky Viper and data-acquisition hardware came in a perfect confluence of rain, drizzle and fog. Peleg piloted the car while we watched a dash-mounted screen. It displayed what the car’s software was seeing: heat signatures of everything in an urban setting during lunch hour.
The images we captured from the demo car’s screen show how acutely the thermal sensor picked up pedestrians both in and alongside the road at a distance, including those at crosswalks and far along the sidewalk. Traffic, pedestrians and commercial delivery drivers criss-crossing the wet roads were white “ghosts” against a grey, cluttered background.
Sensor performance continues to improve. The current-generation AdaSky Viper FIR offers 640 x 480-pixel resolution at a refresh rate of 60 frames per second. It can detect a delta-T as small as 0.05 K, making it capable of classifying road-surface conditions ahead of the vehicle. FLIR’s Judd says his company’s sensors offer the same sensitivity: “the difference in temperature between your middle and index fingers; it can ‘see’ that easily. Determining whether the object is a human, a deer, or a dog, the computer doing the analytics makes those decisions. It looks at temperature from our sensor’s data and at visible color, shape, movement patterns and other inputs.”
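The quoted resolution and refresh rate imply a substantial raw data stream for the downstream analytics computer. A back-of-envelope calculation, assuming for illustration a 14-bit pixel depth (a common thermal-sensor format; the Viper’s actual output format isn’t stated here):

```python
# Back-of-envelope raw data rate for a 640 x 480 sensor at 60 fps.
# The 14-bit pixel depth is an assumption for illustration only;
# resolution and frame rate are the figures quoted in the text.
width, height, fps, bits_per_px = 640, 480, 60, 14

pixels_per_frame = width * height              # 307,200 pixels per frame
bits_per_second = pixels_per_frame * fps * bits_per_px
mbits_per_second = bits_per_second / 1e6       # roughly 258 Mbit/s raw
```

Under these assumptions the sensor produces on the order of a quarter-gigabit per second before any compression or processing, which is one reason such sensors pair with dedicated image-processing silicon.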
In December 2019, FLIR Systems and ANSYS announced a partnership to integrate a fully physics-based thermal sensor into ANSYS’ driving simulator to model, test and validate thermal camera designs within what the companies call “an ultra-realistic virtual world.” Their aim is to reduce OEM development time in optimizing thermal cameras for use with AEB and pedestrian detection.
On differentiating a 98.6-degree human walking across the road on an equally hot summer day in Arizona, the experts agree: it can be a challenge to separate the two. It’s why multiple sensor types are necessary, they said. Peleg said AdaSky’s image signal processors (ISPs) are accurate in reading an object’s emissivity. “We have patents on this chip, which is manufactured for us by STMicroelectronics, same ones doing it for [machine-vision specialist] Mobileye.” Such chips draw only 750 milliamps, he added.
Not all of the industry is convinced thermal-imaging sensing is ready—and right—for near-term ADAS deployment. “Although it does fill a ‘white space’ among the current sensor technologies, I think we serve a very good range with our camera, radar and lidar,” said Marcus Christensen, North America customer chief engineer at Continental Automotive, when asked about his company’s plans in this area.
“Thermal sensing is ideal for detecting objects behind the vehicle—in fact, I think rear-facing thermal imaging is a viable use case,” observed Aaron Jefferson, VP product strategy, global electronics, at ZF. “It will definitely be needed for Level 4. And there might be specialized use cases, but at low volume, particularly for Level 2. At the moment, however, it doesn’t enhance Level 2 or Level 2+ capability, where the driver’s always responsible, enough to justify its cost. It’s a hardware and functionality cost leap.”
Thermal + Radar AEB testing at ACM
To develop a proof-of-concept automatic pedestrian-detection system that fuses radar and thermal camera data and can estimate the distance of a pedestrian from the front of a test vehicle, FLIR Systems contracted VSI Labs. The test vehicle was programmed to stop automatically when the dummy pedestrian came within what the system determined to be an emergency-stop distance.
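The emergency-stop threshold in such a system is essentially a stopping-distance comparison: the fused sensors’ range estimate to the pedestrian is checked against the distance the vehicle needs to come to rest. A simplified sketch of that logic follows; the deceleration, latency and margin figures are illustrative assumptions, not VSI Labs’ values:

```python
def min_stop_range_m(speed_mps: float, decel_mps2: float = 6.0,
                     system_latency_s: float = 0.3) -> float:
    """Distance needed to stop: travel during system latency plus
    braking distance v^2 / (2a). Deceleration and latency defaults
    are illustrative assumptions, not measured system values."""
    return speed_mps * system_latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

def should_brake(pedestrian_range_m: float, speed_mps: float,
                 margin_m: float = 2.0) -> bool:
    """Trigger AEB when the fused range estimate falls within
    stopping distance plus a safety margin."""
    return pedestrian_range_m <= min_stop_range_m(speed_mps) + margin_m

# At 25 mph (~11.2 m/s), these assumptions give a stopping
# distance of roughly 14 m before the safety margin is added.
```

A production system would of course add track-level filtering, target classification and hysteresis before commanding the brakes; the point here is only the range-versus-stopping-distance comparison at the core of the test.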
Initial tests were completed in December 2019 at the American Center for Mobility (ACM) near Detroit. The test design was based on Euro NCAP, but not all testing requirements were met: weather during the testing period was colder than the specified temperature range, and wet, slick, snow-covered roadways and wind interfered with the test fixtures.
Three test cases were conducted in both daylight and darkness, giving six datasets and 35 total test runs using an adult Euro NCAP Pedestrian Target (EPTa):
1. EPTa stationary in the middle of the test vehicle’s lane in Car-to-Pedestrian Longitudinal Adult 50% (CPLA-50) tests.
2. EPTa crossed in front of the vehicle from the roadside in Car-to-Pedestrian Far Side Adult 50% (CPFA-50) tests.
3. EPTa crossed in front of the vehicle from an obstructed position in Car-to-Pedestrian Far Side Adult Obstructed 50% (CPFAO-50) tests.
Test results were promising, according to VSI. In all runs for all test cases, the car’s AEB system successfully brought it to a stop before impacting the EPTa. Additional testing is planned for spring/summer 2020 following AEB algorithm optimization, EPTa heating improvements and when weather is within test parameters.