LiDAR: New “eyes” for vehicle autonomy

Steadily evolving lidar sensor technology will offer significant leaps forward in autonomous capability—once cost is reduced.
The acronym stands for Light Detection and Ranging, and lidar first showed its potential in May 1962, when electrical engineers Louis Smullin and Giorgio Fiocco used the 12-inch telescope at MIT’s Lincoln Laboratory to bounce 50-joule laser pulses off the Moon, receiving the reflected light with the lab’s 48-inch telescope. Using time-of-flight calculations, the two researchers became the first to measure the distance between Earth and its only natural satellite with a laser.
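The time-of-flight principle behind that measurement is simple: one-way distance is the round-trip travel time multiplied by the speed of light, divided by two. A minimal sketch, using an illustrative ~2.56-second lunar round trip (an assumed value, not a figure reported from the 1962 experiment):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Return one-way distance in meters from a round-trip pulse time."""
    return C * round_trip_s / 2.0

# A ~2.56 s round trip corresponds roughly to the mean Earth-Moon distance.
print(f"{tof_distance_m(2.56) / 1000:.0f} km")  # ~383734 km
```

The same arithmetic, at nanosecond scale, underlies every automotive lidar return: light covers about 30 cm per nanosecond, so a 200-m target answers in roughly 1.3 microseconds.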
Four decades later, after service in meteorology and the Apollo program, lidar emerged as a vital sensor on the robotic-vehicle prototypes competing in the U.S. Department of Defense’s DARPA Grand Challenge. With up to 64 lasers and sensors packed in large, roof-mounted cylinders, the ungainly prototypes resembled spinning soup cans when operating. The sensor units swept a 360° field of view around the vehicle, firing thousands of light pulses. Objects that reflected light within the sweep were identified as a cloud of points, supplementing imaging from the multiple radars and cameras also fitted to guide the driverless vehicles.
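The “cloud of points” those sweeps produce comes from converting each return’s beam angles and measured range into Cartesian coordinates. A minimal sketch of that conversion (real sensor drivers also carry intensity, timestamps and per-beam calibration, all omitted here):

```python
# Convert one lidar return, reported as (azimuth, elevation, range),
# into an x/y/z point in the sensor's own coordinate frame.
import math

def polar_to_xyz(azimuth_deg: float, elevation_deg: float, range_m: float):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# A return straight ahead (0 deg azimuth, 0 deg elevation) at 10 m:
print(polar_to_xyz(0.0, 0.0, 10.0))  # (10.0, 0.0, 0.0)
```

Repeating this for every pulse in a 360° sweep yields the point cloud that the vehicle’s perception software then segments into obstacles, road surface and free space.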
By 2007, the annual event’s fourth year, the winning vehicle and five of the six finishers had early lidars from technology pioneer Velodyne Acoustics mounted atop their roofs. Each unit reportedly cost nearly $80,000.
Today’s steadily evolving lidar typically functions much as the early units did, albeit with new hardware and software. The proven mechanical-scanning types hold the practical lead for automotive use, experts say, while newer solid-state (no moving parts) devices are expected to deliver greater reliability and a more compact form factor—crucial for integrating the unit within the car’s exterior skin. Solid-state types typically have a more limited field of view (FOV), but their lower cost makes it feasible to employ multiple sensors to cover a broader area.

Depending on the vehicle’s exterior geometry as determined by styling, a typical car or truck could need up to eight compact lidar units with narrower acceptance angles—120° at the front and rear, 90° on the sides—compared with a single big 360° unit on the roof. Range requirements can also be relaxed: engineers are aiming for 30 meters to the sides, 200 meters to the front and 50 meters to the rear, depending on how microwave radar is integrated into the vehicle’s safety suite.
Lidar development is booming, as start-ups and established Tier suppliers race to enable SAE Level 3, 4 and 5 automated-driving capability in future vehicles. The technology is also in big demand for 3D mapping.
Frost & Sullivan forecasts sales of 6 million lidar units in 2025—half of them for use in autonomous vehicles, for a projected $2-billion market. Automotive Engineering counts more than 30 start-ups in the field, along with market leader Velodyne and some Tier 1s including Bosch, Valeo, and Continental. Aptiv and Magna are among top suppliers that are partnering with lidar specialists. OEMs continue to acquire and build ties with lidar developers, including BMW (Innoviz via Magna), GM (Strobe), Toyota (Luminar) and Ford (Princeton Lightwave and Velodyne).
While some advocates have dubbed lidar “the essential piece of the puzzle for self-driving cars,” it is certainly the most discussed and perhaps controversial sensor technology related to autonomous vehicles.
“Every couple of weeks a new company is touting new lidar technology adaptations,” observed veteran mobility-tech consultant Gerald Conover. “Many of their claims are based on lab-project results, so designs producible in high volume may still be some years away.” Only those that deliver high performance at low cost will survive. Depending on the uptake for autonomous vehicles in SAE Levels 4 and 5, however, “the demand for lidar devices could be significant,” Conover noted.
Cost remains the nagging impediment to the mass deployment of automotive-grade units. Development units still cost $10,000 or more—“not a sustainable number for automotive production,” Conover quipped. “The thing standing in the way of this is the necessary expertise to produce working lidar, which is in the hands of only a few supplier firms.” OEMs eventually expect a steady cost-reduction path to commodity status, similar to those of onboard radar and cameras.
Since late 2017, Quanergy has been producing a 905-nanometer solid-state lidar with a range of 150 meters at 8% reflectivity. An optical phased-array type, it can scan half a million points per second with a spot size of 3.5 cm at 100 meters, the company claims. While single-unit samples are priced in the thousands, Quanergy believes high-volume scale will drive per-unit cost below $300. By comparison, Velodyne’s lowest-priced 16-laser unit costs $4,000.
Horses-for-courses tech choices
While the automotive lidar space is white-hot, not all OEMs see the technology as an imperative. Tesla’s Autopilot system uses camera-based optical recognition, and company boss Elon Musk appears unconvinced that lidar is a game-changer. “Once you solve cameras for vision, autonomy is solved; if you don’t solve vision, it’s not solved,” Musk said during a TED Talk in April 2017. “You can absolutely be superhuman with just cameras,” he added. He obliquely labeled lidar “a crutch.”
Honda North America is “doing development and testing with lidar,” noted Jay Joseph, assistant VP of Product Planning. He said Honda engineers believe lidar is necessary in the short term. “Longer-term, of course, we’d like to see other solutions—probably more dependent on connectivity and shared information. But until that’s reliable, lidar is probably necessary to provide good information to the vehicle so it can make good decisions.”
Assembling and integrating the sensor array into the vehicle is an important role that experienced Tier 1s including Aptiv are playing. “We understand how it works with the vehicle; some of the tech start-ups don’t understand vehicles well,” noted Jada Smith, Aptiv’s VP of advanced technology.
Smith said her company wholly believes in the tech triad of cameras, radars and lidar for vehicle autonomy. Lidar, she said, is “a necessary piece of technology, to handle all use cases, provide redundancy, and to help the vehicle see everything going on around it.” Aptiv is covering multiple technology bases with its investments in LeddarTech (flash lidar), Innoviz (MEMS type) and Quanergy (optical phased array).
Choosing lidar types is a horses-for-courses engineering exercise. “What roles do we expect them to play?” Smith asked. “The longer the range, the narrower its FOV—same concept as a camera. Depending on what performance we want, we may choose a flash type for one and MEMS for another. It’s tradeoffs, depending on what we’re trying to accomplish.”
Automotive Engineering recently spent time with a group of lidar innovators and brings the following insights into their backgrounds and technologies.
Innoviz Technologies
Like so many sensor-tech developers, Innoviz is based in Israel, and several of its principals have specialized-electronics backgrounds from service with the Israel Defense Forces. Founded three years ago, the company has about 150 employees globally and has garnered more than $80 million in investment funding, including stakes from Aptiv, Magna and Samsung.
The foundation technology is a microelectromechanical systems (MEMS)-based design in which movement of the mirror that projects the scanning lasers comes from a solid-state chip. Critically, Innoviz promotes laser scanning at 905 nm, a long-established wavelength that Aditya Srinivasan, general manager, North America, said allows the company to keep a lid on cost—to the point at which Innoviz can offer its first automotive-grade lidar sensor, InnovizOne, starting in 2019 at a cost in the hundreds of dollars.
There may be some contention about whether the Innoviz design should be defined as solid-state, since the system uses a moving mirror, but, “Rightly or wrongly, we’re calling this ‘solid-state,’” said Srinivasan.
The InnovizOne is “designed for seamless and easy integration into any mass-market vehicle,” the company’s literature describes. The system delivers 120°-horizontal and 25°-vertical FOV for high-definition resolution of 7.5 million pixels/sec; the frame rate is 25 frames/sec. Claimed range is up to 250 m (820 ft). The unit’s footprint measures 50 mm high by 110 mm wide by 100 mm deep (2 x 4.3 x 3.9 in.). Data is managed by a proprietary signal-processing chip that’s been developed with “a partner” that Srinivasan chooses not to name.
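The quoted figures can be sanity-checked with quick arithmetic: 7.5 million pixels/sec at 25 frames/sec works out to 300,000 points per frame, spread over the stated 120° x 25° FOV (the uniform-grid assumption here is illustrative; the actual scan pattern is not disclosed):

```python
# Back-of-envelope check on the InnovizOne figures quoted above.
points_per_sec = 7_500_000
frames_per_sec = 25
h_fov_deg, v_fov_deg = 120, 25

points_per_frame = points_per_sec // frames_per_sec
points_per_sq_deg = points_per_frame / (h_fov_deg * v_fov_deg)

print(points_per_frame)    # 300000 points per frame
print(points_per_sq_deg)   # 100.0 points per square degree (~0.1 deg grid)
```

One hundred points per square degree implies roughly 0.1° angular spacing, comfortably fine enough to resolve a pedestrian at highway ranges.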
The company also claims its system is adept at identifying objects with extremely low reflectivity—a performance aspect that to now has been a challenge for many lidar developers.
In April, Innoviz announced a supply agreement with BMW through Innoviz’s supplier partner Magna. BMW said it intends to offer an SAE Level 3 autonomous ride-hailing service in 2021, and Innoviz-derived lidar apparently will be a key component.
Aptiv and Samsung also are partnered with Innoviz for automotive lidar development. In June, the company said it had formed a partnership with Chinese automotive Tier 1 HiRain Technologies, which supplies several major Chinese automakers and is integrating Innoviz components into its own autonomous-driving platform.
TetraVue
The primary differentiation point for TetraVue’s solid-state “flash” lidar technology is high resolution—as well as reliance on long-proven, relatively inexpensive sensor technology derived from the digital-camera world. In fact, the company refers to its automotive lidar as a “high-definition 4D camera” that essentially fuses mega-pixel digital video capture with lidar for long-range sensing with pixel-level depth information.
Resolution is everything to TetraVue founder and executive VP Paul Banks, who has a Ph.D. in applied physics but explains the company’s presumptive technology advantage in plainspoken terms. Banks removes his eyeglasses, saying the state of California would not certify him to drive without them—yet most competing lidar technologies “see” with less resolution than the state’s minimum vision requirement for humans to legally drive.
“For us, that’s what’s important,” Banks said flatly. “High resolution. We actually use the same (image) sensor that’s in your cell phone. We cheat,” he deadpanned.
His argument is a compelling one that’s interested investors such as Bosch and Samsung. Appropriating the well-developed and extremely cost-driven complementary metal oxide semiconductor (CMOS) and charge-coupled device (CCD) sensing technology of digital cameras, TetraVue’s lidar flashes the environment at up to 30 fps with lasers operating at the invisible-to-the-eye 800-nm wavelength. This “illumination” is merged with the high-resolution video-capture to derive depth information at the pixel level.
Banks’ demonstration borders on amazing, as he shows a data scene of a dancer seen with “conventional” lidar and TetraVue’s lidar; the additional perspective and depth from the TetraVue image is patently startling.
“It looks and feels more like a video camera,” Banks said. And he does not exaggerate—showing up to 60 million points per second, the images from the company’s system make the techy-but-still-scratchy visual representations from competitors seem like the visuals from an ancient video game.
The current downside to Vista, California-based TetraVue’s lidar may be a comparative lack of range. The current design, Banks said, has a range of about 150 m (492 ft)—Velodyne’s latest system, for example, boasts a range twice that far. TetraVue’s range may be improved, he said, “but for us, it all comes down to cost.” He said the company is intent on delivering its advanced technology at a price conducive to mass-market application.
Ouster
Founder and CEO Angus Pacala’s Ouster could be the tech age’s embodiment of the military maxim attributed to the Civil War’s Nathan Bedford Forrest: “Get there firstest with the mostest.”
Pacala extols one of his company’s market advantages as just that: Ouster is shipping automotive lidars today, he said.
“We let our products do the talking,” he boasted of the “smartest, lightest 360-degree, 3D sensing in the market.” He also said his company is the only one to openly and transparently price its technology for any buyer.
Pacala, formerly the director of engineering at Quanergy, said Ouster’s OS-1 is the highest-resolution lidar currently commercially available, and it has best-in-class power consumption, size and weight. The system measures 1.3 million points per second, yet consumes less than 15W. And like TetraVue, the company’s technology is rooted in comparatively low-cost, highly-developed CMOS technology used for years in ever-advancing smartphones and digital cameras.
To keep costs reasonable, the OS-1 lasers operate on the 850-nm wavelength; cost is “laddered” to some degree according to the customer’s need for channels: the highest-cost versions use 64 emitters to deliver each vertical field-of-view “slice,” while lower-performance requirements can cut cost with just 16 channels. Pacala said the company expected to ship 10,000 to 20,000 units by the end of 2018.
The roughly double-puck-sized OS-1 weighs just 330 g and measures 2.5 in (63 mm) tall by 3.14 in (80 mm) in diameter. It is not solid-state: the unit spins to cover its 360° horizontal range and nearly 32° of vertical FOV. Its accuracy is about 3 cm (1.2 in)—but range is a comparatively abbreviated 120 m (394 ft).
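The link between channel count and vertical resolution noted above is straightforward: spreading the emitters across the vertical FOV sets the spacing between beams. A sketch using the OS-1 figures from the text (the evenly spaced beam layout, with emitters at both FOV edges, is an assumption):

```python
# Vertical beam spacing on a spinning multi-channel lidar,
# assuming channels are spread evenly across the vertical FOV.
def vertical_spacing_deg(v_fov_deg: float, channels: int) -> float:
    return v_fov_deg / (channels - 1)  # beams placed at both FOV edges

# OS-1 figures from the text: ~32 deg vertical FOV, 64 vs. 16 channels.
print(round(vertical_spacing_deg(32.0, 64), 2))  # 0.51 deg between beams
print(round(vertical_spacing_deg(32.0, 16), 2))  # 2.13 deg between beams
```

The arithmetic shows why the 16-channel version is the budget option: a quarter of the emitters means roughly four times the vertical gap between beams, so distant objects intersect fewer scan lines.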
Range improves with the coming OS-2, which Ouster indicates will have a 200-m (656-ft) range and 64 channels spread across a 15.8° vertical FOV, although the unit is correspondingly larger and heavier. Pacala said the OS-2 would be available in the third quarter of this year.
Valeo
French Tier 1 supplier Valeo late in 2017 achieved the distinction of supplying what is believed to be the first lidar sensing system deployed on a series-production vehicle: Audi’s A8 sedan, widely described as offering SAE Level 3 conditional automation. The A8’s Traffic Jam Pilot system controls acceleration, braking and steering at speeds up to 37 mph (60 km/h), using the company’s Scala lidar.
Scala, a solid-state design developed in cooperation with LeddarTech, won a 2018 PACE award for supplier innovation. Valeo said Scala has a 145° horizontal field of view and range of 150 m. As is typical for many sensing technologies, LeddarTech says its advances are largely in proprietary processing and algorithms: “essentially an ensemble of software, algorithms and know-hows that are used to design or optimize various types of solid-state lidar sensors,” according to company literature.
Some anxious early adopters won’t find Traffic Jam Pilot and the Valeo/LeddarTech lidar array just yet, however; the system initially is not available in many countries. Audi has been reluctant to introduce the technology in the U.S. and other markets that lack clear legal and regulatory frameworks addressing conditional autonomy.
In May, Valeo announced a “strategic cooperation” with Apollo, the open autonomous-driving platform created by China’s Baidu in 2017. The company said in a release that it will contribute its expertise in sensors to the Apollo project, as well as its “skills in sensor cleaning systems and connectivity between autonomous vehicles.”