"We are at the edge of new technology," said Christian Schumacher, Continental's Director of Engineering Systems & Technology, NAFTA to AEI in a recent interview. He was describing Continental’s automated driving test bed designed to provide in-depth data on future autonomous driving. Its goals are to produce a safer experience and reduce driver fatigue.
Recognizing that human error remains a significant contributor to accidents, developing advanced driver-assistance systems for safety makes sense. Technology for this latest project came from the U.S. Department of Defense's DARPA (Defense Advanced Research Projects Agency) Urban Challenge of 2007, while the EU-funded HAVEit (Highly Automated VEhicles for intelligent transport) project provided the knowledge and experience base, according to the company.
Sensors only, please
What is perhaps significant about this study is that it uses only onboard sensors. There is no V2V or V2I (vehicle-to-vehicle or vehicle-to-infrastructure) technology involved. Nor are there any customized or expensive, non-automotive-grade sensors. "It is amazing what we can get from current automotive sensors. Really, current sensors are good enough," said Schumacher.
The test system uses four short-range radar sensors (two each in front and rear), one long-range radar, and a stereo camera. The stereo camera is Continental's MFC300, with a 53° by 30° field-of-view and a baseline of 220 mm (8.7 in) between cameras.
According to Schumacher, the stereo system will detect pedestrians up to 40 m (131 ft) away at night (with headlamps) and up to 80 m (262 ft) away in daylight. The front long-range radar is an ARS 300, 77-GHz unit with two fields-of-view—56° short and 17° long—and two ranges—short range of 60 m (197 ft) and long range of 200 m (656 ft). The side radars are reported by Continental as 24-GHz BSD units, each with a field-of-view of 150° and a range of 8 m (26 ft). The sensors are integrated in a system that includes electronically controlled braking and electric power steering systems.
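The sensor suite described above can be summarized in a short sketch. This is purely illustrative: the data-structure names and groupings are ours, not Continental's, and the figures are simply those quoted in the article.

```python
from dataclasses import dataclass

@dataclass
class SensorSpec:
    """One entry in the test vehicle's sensor suite (figures as quoted above)."""
    kind: str
    count: int
    fov_deg: float   # field-of-view, degrees
    range_m: float   # maximum detection range, meters

# Illustrative summary of the suite; not an official Continental spec sheet.
SENSOR_SUITE = [
    SensorSpec("short-range radar (front/rear)", 4, 150.0, 8.0),    # 24-GHz BSD units
    SensorSpec("long-range radar, short FOV", 1, 56.0, 60.0),       # ARS 300, 77 GHz
    SensorSpec("long-range radar, long FOV", 1, 17.0, 200.0),       # same ARS 300 unit
    SensorSpec("stereo camera, daylight pedestrians", 1, 53.0, 80.0),  # MFC300, 220-mm baseline
]

# The longest reach in the suite comes from the long-range radar mode.
max_range_m = max(s.range_m for s in SENSOR_SUITE)
```

A summary like this makes the trade-off visible at a glance: wide field-of-view sensors cover short ranges, while the narrow 17° radar beam reaches out to 200 m.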
According to Continental, if another vehicle enters the sensors’ field of view, the stereo camera captures this vehicle’s immediate surroundings. It then either brakes to adapt to traffic flow or steers the vehicle out of the lane to avoid a potentially hazardous situation. The highly automated system is designed to keep a driver active behind the wheel, unlike a completely autonomous vehicle.
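The brake-or-steer reaction Continental describes can be sketched as a simple decision rule. The function name, signature, and thresholds below are our own assumptions for illustration; the article does not disclose the actual control logic.

```python
def react_to_vehicle(ego_speed_mps: float,
                     lead_speed_mps: float,
                     adjacent_lane_clear: bool) -> str:
    """Illustrative sketch of the reaction described in the article:
    brake to adapt to traffic flow, or steer out of the lane to avoid
    a hazard. All inputs and thresholds are hypothetical."""
    if lead_speed_mps >= ego_speed_mps:
        return "maintain"            # no conflict with the vehicle ahead
    if adjacent_lane_clear:
        return "steer_out_of_lane"   # avoid the hazard laterally
    return "brake"                   # otherwise adapt speed to traffic flow
```

For example, with a slower vehicle ahead and a blocked adjacent lane, the sketch falls back to braking, mirroring the behavior the article attributes to the test vehicle.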
The company also made a point of driving in traffic-jam scenarios. It found that in situations that exceeded the system's capabilities for automated driving (where road markings could not be detected or bends were too tight), the system switched itself off. The driver then had to resume control of the vehicle, or the vehicle's speed was gradually reduced until it came to a stop. (See how V2X is also an option in Networked cars put to traffic jam test in Germany.)
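The fallback behavior above—driver takes over, or the vehicle ramps down to a stop—can be sketched as a per-timestep handover rule. The deceleration rate, time step, and state names are illustrative assumptions, not Continental's values.

```python
def handover_step(speed_mps: float,
                  driver_resumed: bool,
                  decel_mps2: float = 1.0,
                  dt_s: float = 0.1) -> tuple[float, str]:
    """One control step of the fallback described in the article: after the
    system switches itself off, either the driver resumes control or speed
    is gradually reduced until the vehicle stops. Parameters are hypothetical."""
    if driver_resumed:
        return speed_mps, "driver_in_control"
    new_speed = max(0.0, speed_mps - decel_mps2 * dt_s)
    return new_speed, ("stopped" if new_speed == 0.0 else "decelerating")
```

Run in a loop, the sketch brings an unattended vehicle to a standstill; a single call with `driver_resumed=True` hands control straight back without touching the speed.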
Sensor fusion development link to future
Continental reports that the system has completed at least 6000 mi (9660 km) of highly automated driving on public roads of a planned 10,000-mi (16,100-km) test in Nevada. Why Nevada? The state passed legislation in 2011 that allows a driver's license endorsement for the operation of an autonomous vehicle on its highways. The law defines an "autonomous vehicle" to mean a motor vehicle that uses artificial intelligence, sensors, and global positioning system coordinates to drive itself without the active intervention of a human operator. To legally register one of these vehicles in Nevada, the state requires that 10,000 mi of testing be done in autonomous mode, according to Continental.
Truly autonomous driving is still in the future, admitted Schumacher: "For the foreseeable future, I can see semi-autonomous driving useful in low speed driving," he said. “Next [in progression of development] would be highway-only driving, perhaps in separate lanes set aside for that purpose. It is easier for today’s technology to lock onto a highway, still with a driver in the loop.”
He emphasized that he does not predict unattended driving in the near term, but said this advanced form of semi-autonomous driver assistance is probable.
What are the remaining technical challenges? "We need to work on integrating and creating a fuller view of the environment," said Schumacher. "Sensor fusion is a key requirement and we need to get that right."