“To get that product to market the things that become important overall—especially from the ADAS perspective—are to make sure that the system operates properly in a variety of conditions and to make sure it’s maintainable. It needs to do what it’s supposed to do, and not do things that aren’t expected, like giving false alerts or braking when it shouldn’t,” said Bendix Commercial Vehicle Systems’ Fred Andersky, Director of Marketing and Customer Solutions, Controls.
Not only does that mean testing in all weather conditions but also getting the technology into the hands of real fleets and drivers to put it through real operational protocols. “And you have to be able to test and review the maintenance aspects: You want to make sure if you’re fixing the radar, for instance, that you don’t inadvertently affect the operation of another interconnected system, because as systems become more and more complex, changes that you make to one may have impacts on others,” Andersky noted.
The supplier and developer uses both physical testing and simulation to prepare products for market. “We have a strong, what we call hardware-in-the-loop (HIL) approach—where we can actually have different parameter settings on a computer and actual equipment that’s built into this hardware-in-the-loop so we can see how the system will respond and what will happen in that response,” Andersky told Truck & Off-Highway Engineering Magazine. “That allows us to fine-tune the parameter setting and we can get a lot of that done up front.”
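The closed loop Andersky describes, with actual equipment responding to parameter settings on a computer, can be sketched in miniature. The Python below is a hypothetical illustration only: the controller stub, the toy plant model, and the distances and actions are all invented for this sketch, not Bendix's.

```python
# Hypothetical, highly simplified sketch of a hardware-in-the-loop step:
# in a real rig the "controller" is the actual ECU hardware; here a stub
# stands in, and the plant model is invented for illustration.

def controller(distance_m: float) -> str:
    """Stand-in for the real hardware under test: decide an action."""
    if distance_m < 10.0:
        return "brake"
    if distance_m < 25.0:
        return "alert"
    return "none"

def run_hil(initial_gap_m: float, closing_speed_mps: float, steps: int):
    """Step the simulated plant, feed each state to the controller, and
    log the responses -- the closed loop a HIL rig automates."""
    gap, log = initial_gap_m, []
    for _ in range(steps):
        action = controller(gap)
        log.append((round(gap, 1), action))
        if action == "brake":
            closing_speed_mps = max(0.0, closing_speed_mps - 5.0)
        gap -= closing_speed_mps
    return log

trace = run_hil(30.0, 5.0, 4)
```

The value of the rig is that the logged trace shows exactly how the system responded at each simulated state, before any truck leaves the lot.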
Logging the miles
To physically test a variety of parameters, testing moves to the track, followed by additional fine-tuning. Mileage accumulation comes next, with the corporate test fleet putting 10,000 to 30,000 miles on the system to see how it responds. Once the company reaches a point of comfort from a mileage-accumulation standpoint, Bendix takes the system to fleet customers to test in real-world conditions.
At this stage of testing, Andersky said they look to get into the 100,000-mile (or so) range across a couple of different customer fleets and operating parameters in a wide variety of environmental conditions. As Bendix runs this regimen, it is also performing additional testing and evaluation, which might involve not only on-road trials and on-track simulation testing but also winter testing, done at the Keweenaw Research Center (KRC) in Houghton, Mich., part of Michigan Tech. The components themselves are tested to ensure nothing breaks and to validate winter performance in different conditions (using salt and humidity baths, for example).
Following a successful fleet trial, when Bendix is ready for product release “it becomes important to keep in mind that we’re not just releasing a product. Because we’re a Tier 1 supplier, the OEM is going to take our product and do their level of testing in their vehicles, in their production operations,” Andersky explained.
Bendix also performs simulation. “Oftentimes when we’re developing new systems, we’re setting parameters for how the system’s going to work,” Andersky said. “Back in the old days, we used to do it [testing] where you’d try a parameter setting: you’d run it, get results, and then you would change one of the 25 parameters; see what that does.” Simulation obviously brings significant time and cost efficiency to that practice.
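The shift Andersky describes, from changing one parameter per test run to evaluating many settings up front in simulation, might be sketched as an automated parameter sweep. This is a hypothetical illustration only; the parameter names and the toy response model are invented, not Bendix's actual settings.

```python
# Hypothetical sketch of an automated parameter sweep in simulation --
# the parameter grid and the toy false-alert model are invented.
from itertools import product

def simulated_false_alerts(sensitivity: float, min_gap_s: float) -> int:
    """Toy response model: count false alerts for one parameter setting.
    Higher sensitivity and a shorter minimum following gap both make
    spurious alerts more likely in this stand-in model."""
    return int(10 * sensitivity / min_gap_s)

def sweep(sensitivities, min_gaps):
    """Try every combination in one batch, instead of changing one
    parameter per physical test run."""
    return {(s, g): simulated_false_alerts(s, g)
            for s, g in product(sensitivities, min_gaps)}

grid = sweep([0.5, 1.0], [1.0, 2.0])
best = min(grid, key=grid.get)   # setting with the fewest false alerts
```

Even a tiny grid like this covers four combinations in one pass; with 25 interacting parameters, sweeping in simulation is where the time and cost savings come from.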
The testing process varies and can depend on the component or system. Some product development can be relatively quick, especially if the company is building on a foundation of a previous product. But using the example of Bendix’s Stability Control (launched in 2005), Andersky said it took a lot longer in terms of testing.
For global supplier dSPACE Inc., comprehensive is also the word when it comes to ADAS testing. The company offers tools to develop and test embedded software for ADAS and autonomous driving systems to its OEM, Tier 1 and Tier 2 customers. “These solutions are very wide ranging across the development V-cycle, as well as very detailed to provide a comprehensive toolchain,” said Mahendra Muli, Director – Marketing & New Business Development for dSPACE Inc.
The company has simulation platforms for model-in-the-loop and software-in-the-loop (MIL/SIL) testing with vehicle models, environment and traffic models and sensor models, as well as HIL platforms for later stages of development. “These tools are backed by a data management backbone so that the artifacts, such as models and test scripts, can be reused throughout the development process. The test setups for some systems, like steering, can be quite complicated and large,” Muli said.
In recent years, dSPACE also introduced tools to enable the testing of sensors and sensor-fusion algorithms “that are at the heart of the autonomous systems,” Muli said. These tools include driving environment simulation models, complete with roads, pedestrians, buildings, etc., and the interaction of sensors within this environment through sensor simulation models, so that the information, as sensed by a sensor, can be used for validating software behavior.
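As a rough sketch of how a sensor simulation model turns ground truth from an environment model into sensor-level information for validating software, consider the following hypothetical example. The radar range, field of view, and object list are invented for illustration and are not dSPACE's models.

```python
# Hypothetical sketch of sensor-model-based validation -- the sensor
# parameters and the ground-truth object list are invented.
import math

def radar_model(objects, max_range_m=100.0, fov_deg=60.0):
    """Filter simulated ground-truth objects down to what a forward radar
    with the given range and field of view would actually report."""
    detections = []
    for x, y in objects:                 # vehicle frame: x forward, y left
        rng = math.hypot(x, y)
        bearing = math.degrees(math.atan2(y, x))
        if rng <= max_range_m and abs(bearing) <= fov_deg / 2:
            detections.append((x, y))
    return detections

# Ground truth from an environment model: two vehicles ahead, one behind.
ground_truth = [(50.0, 2.0), (120.0, 0.0), (-10.0, 0.0)]
visible = radar_model(ground_truth)
```

The software under test sees only `visible`, not the ground truth, which is exactly what lets a fusion algorithm be validated against known scene contents.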
“This methodology can be used purely in the software environment during the early stages of development and with the ECU hardware in the later stages,” Muli noted. These tools provide the basis for global development and testing of these advanced technologies across the automotive and commercial-vehicle industries.
The products used in the development of the autonomous systems are comprehensively tested prior to release. “Our tools are often used in the highest level of software criticality as rated by Automotive Safety Integrity Levels (ASIL),” Muli said. “Our innovation has to be rapid though. Given the pace of technology, particularly in the autonomous driving domain, we have to be very agile and fast in our development.”
The company works closely with its customer base to capture requirements early and offer solutions in an agile manner by placing its development teams close to early drivers of innovation. “This gives us massive leverage and puts us ahead of our competition to offer highly matured technology that meets critical requirements,” he said.
dSPACE’s hardware and software products provide completely open and customizable platforms that can be programmed for any application. “The same applies for our MotionDesk tool that is used for 3D animation and visualization of ADAS/AD applications, alongside our PC-based, real-time simulation platforms, which can be used in both passenger and commercial vehicle applications,” Muli said.
Evolution to autonomy
In the early phases of commercially deployable autonomous driving systems, “there is a lot of development and testing that is yet to take place,” Muli told TOHE. The early prototype systems being demonstrated currently are validated mostly by on-road testing. “However, it is commonly acknowledged that billions of miles of driving and millions of scenarios would be required to have confidence in the systems to be released in a commercially-viable manner. This is a huge testing task,” he noted.
On-road testing alone is not enough. Lab-based, comprehensive simulation capabilities that can scale widely will be required. “A large database of scenarios, with standard definitions of scenarios, would be necessary,” Muli offered. “It would be equally important to provide a good database solution to handle such large-scale testing artifacts. The requirements are continuously evolving and so are the toolchains. This work will continue to take shape over the next years as automated driving technology matures.”
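A scenario database of the kind Muli describes might, in miniature, look like the following hypothetical sketch. The fields and scenarios here are invented for illustration; real standardized scenario definitions are far richer.

```python
# Hypothetical sketch of a scenario database for large-scale simulation
# testing -- the schema and entries are invented examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    name: str
    road_type: str      # e.g. "highway", "urban"
    weather: str        # e.g. "clear", "rain", "snow"
    actors: int         # number of other traffic participants

SCENARIOS = [
    Scenario("cut-in", "highway", "clear", 2),
    Scenario("pedestrian-crossing", "urban", "rain", 1),
    Scenario("stopped-vehicle", "highway", "snow", 1),
]

def select(scenarios, **criteria):
    """Pick the scenarios matching every given field, so a regression
    run can be scoped (e.g. all highway cases) and repeated exactly."""
    return [s for s in scenarios
            if all(getattr(s, k) == v for k, v in criteria.items())]

highway_cases = select(SCENARIOS, road_type="highway")
```

Standard scenario definitions matter precisely because a selection like this must mean the same thing across toolchains and test sites.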
Bendix’s approach to automated systems is maybe a bit different than others. “While some companies, such as Waymo and Uber, want to go right to a Level 4 and Level 5 autonomous approach, taking a more revolutionary approach, we think it needs to be a little bit more evolutionary and that really applies to proving that the technology is going to work all the way up the ladder by developing what we call a stepping stone approach—each step builds on driver-assistance technologies,” Andersky said.
Future testing
It is important to find efficient ways of developing ADAS technology, Muli said. “The cost and technical talent requirement is huge and this has resulted in many partnerships among industry players. Evolving toolchains for testing have to provide a means to make the testing process efficient, repeatable, reusable and standards-based.”
Andersky said he does not anticipate big changes in testing for autonomy. “From the testing perspective, I don’t see a lot of change in the basic approach of really making sure that we’re validating the performance of the system across the variety of operational considerations, environmental considerations, traffic considerations—and if we’ve got a strong foundation of driver assistance, getting to those next levels is an iterative change, as opposed to a major change.”