Autonomous vehicles—so-called driverless cars—have been in the news a lot lately. What began last decade with the U.S. Department of Defense's Defense Advanced Research Projects Agency (DARPA) pioneering robo-car races in the desert has evolved into Google's self-driving cars safely motoring (with supervision) some 300,000 mi (480,000 km) on city streets and freeways, not to mention the introduction of new autonomous research vehicles from Audi and Lexus at the recent Consumer Electronics Show (CES).
Engineers at prospective system suppliers such as Bosch, Continental, Delphi, and Denso readily affirm that most of the technology needed for autonomous vehicles already exists in one form or another, with many of the basic sensor elements already installed on high-end car models. The remaining, still-developmental devices and systems, even the "near-fail-safe" smart controls (multiple redundant software algorithms containing thousands of lines of critical code), seem to be coming together, they said, and are now starting to appear on new industry and university test-bed vehicles, including Stanford University's.
One example is Continental's Automated Driving technology prototype, which recently visited the Chicago Auto Show. This modified 2011 Volkswagen Passat, equipped with off-the-shelf sensors found in current luxury production vehicles, has logged more than 15,000 mi (24,000 km) in automated mode, partly to qualify for the second "robot driving license" issued by the state of Nevada.
Another example is the UK's Oxford RobotCar project, which just exhibited a robotic self-driving system in a Nissan Leaf electric vehicle. The system, said the Oxford team, is intended to offer a respite from the daily commute rather than to take over driving completely. The team said it is working on a low-cost "auto drive" navigation system that does not depend on GPS and instead relies on off-the-shelf sensors that are getting cheaper all the time. The goal is a system that costs only $150.
Weeks previously, at a packed CES press conference, Ricky Hudi, Audi's head of electronics development, displayed a fist-size laser scanner designed to scan ahead and to the sides of a vehicle. He noted that the compact sensor is cheaper than the prominent, roof-mounted Velodyne 3D-laser scanners—the "Sauron's Eye" towers—seen on most existing test-beds. Hudi also showed the crowd a prototype circuit board intended to replace the computers in "the central driver assistance unit" that now take up space inside Audi's current autonomy research vehicles.
Fast progress, but no Knight Rider
Such seemingly rapid progress might lead readers to think that robotic cars like those seen in Knight Rider, I, Robot, or even Batman might be just around the corner, but don't rush out to the dealership just yet. Much remains to be done before robo-cars become reality. Besides miniaturization, sterling reliability, and better affordability of the hardware and software, various social and cultural issues must also be addressed. Doubts linger, for instance, regarding whether motorists are really ready to let go of the steering wheel. For now, the automakers are working on making cars easier to drive, more user-friendly, and above all, safer.
“Pretty much everybody thinks that there will be autonomous vehicles some day,” said Toyota’s Corporate Manager for North American Business Strategy Jim Pisz. “But first there has to be a willingness by society to accept it, and we’re not there yet.”
When you ask Pisz about the giant Japanese automaker's efforts to develop autonomous, self-driving cars, he turns the discussion to safety, pointing out the company's Advanced Active Safety Research Vehicle (AASRV). "Our purpose in this is all about developing safer systems," he said.
Despite all the recent efforts to robotize the human driver’s functions, “our philosophy is that the driver is always in control,” Pisz said. “We expect the driver to maintain oversight in all traffic situations.” Autonomy, he explained, does not equate with hands-off self-driving, at least in the initial stages of the technology. “We always believe that these systems should assist the driver to make better decisions in a highly complex environment. The system intervenes only when necessary.”
Toyota takes a layered approach to autonomous driving, he continued. "Our approach is to lay on the building blocks incrementally with each new model in an effort to move toward products that consumers can get to know and eventually can learn to trust, which is hard."
Only part way there
Take Toyota's pre-collision system (PCS) as an example, Pisz said. Ten years ago, the 2003 Lexus RX included only millimeter-wave radar. "If you looked at the LS 460 at CES, it has layers of technology in there." The LS 460 sedan retains a millimeter-wave radar that beams out through the front logo, he noted, but it also has near-infrared light projectors in the headlight cluster to enable the detection of warm bodies at greater range and in the dark, as well as a pair of stereo high-definition cameras inside near the rearview mirror.
The LS 460's control algorithm, he said, has enough smarts to brake automatically if it detects something in the road—stopping fully if the car is moving slower than 25 mph (40 km/h), or braking to mitigate damage from an impending collision. Through what Pisz called "uniquely OEM-based integration efforts," the controller is also connected to other systems of the car. "For example, when the system decides to brake, it simultaneously stiffens the suspension, so the car won't nose dive." Plus, the steering engineers have gotten involved, providing faster steering response to enhance emergency avoidance maneuvers.
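In rough terms, the decision logic described above might be sketched as follows. This is purely an illustrative sketch: the function and field names, and the idea that the threshold check and suspension command live in one routine, are assumptions, not Toyota's actual implementation.

```python
# Illustrative sketch of a pre-collision braking decision, based on the
# behavior described in the article. Names and structure are hypothetical.

FULL_STOP_THRESHOLD_KMH = 40  # ~25 mph: below this, the car can stop fully

def pcs_response(speed_kmh: float, obstacle_detected: bool) -> dict:
    """Decide how the pre-collision system reacts to a detected obstacle."""
    if not obstacle_detected:
        return {"brake": "none", "suspension": "normal"}
    if speed_kmh < FULL_STOP_THRESHOLD_KMH:
        # Slow enough to stop completely before impact
        action = "full_stop"
    else:
        # Too fast to stop in time: brake hard to mitigate impact energy
        action = "mitigate"
    # Braking and chassis actions are coordinated: the suspension is
    # stiffened so the car does not nose-dive under hard braking.
    return {"brake": action, "suspension": "stiffened"}
```

The point of the sketch is the coordination Pisz describes: one decision simultaneously commands the brakes and the suspension, rather than leaving each subsystem to react on its own.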
The AASRV, an LS 600h equipped with an array of sensors and an Advanced PCS to observe, process, and respond to the vehicle's surroundings, has been under development in various forms since 2008 at the Toyota Research Institute of North America (TRINA), the company's Ann Arbor, MI, facility.
On the roof is a Velodyne 360-degree light detection and ranging (LIDAR) laser scanner (reportedly costing around $70,000) that locates objects out to about 70 m (230 ft). Radars on the vehicle’s front and sides measure the position and speed of objects to create, for example, a comprehensive field of vision at intersections. Three high-definition color cameras see objects “beyond the reach of the brightest headlights,” he said. The front camera can tell a red from a green traffic light, while the side imagers detect approaching vehicles and help estimate their trajectories.
All this is linked by a databus network that constantly feeds situational information to a control algorithm that perceives, processes, decides what to do, and then implements it. "All of those things must now act together in real time to image and analyze the scene, determine its contents, and interpret it to discover whether anything may go wrong and respond correctly to its surroundings," he said.
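A single pass of that perceive-process-decide-act loop could be sketched roughly as below. Everything here is an assumption for illustration—the data structure, the simple "nearest closing object" fusion rule, and the distance thresholds are invented, not Toyota's algorithm, which fuses far richer sensor data.

```python
# Hypothetical single step of a sense-decide-act loop for an autonomous
# test-bed. Sensor fusion here is reduced to "react to the nearest threat."

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "lidar", "radar", or "camera"
    distance_m: float  # range to the detected object
    closing: bool      # is the object on a closing trajectory?

def control_step(detections: list) -> str:
    """Fuse one frame of sensor detections and pick a response."""
    # Perceive: keep only objects moving toward the vehicle
    threats = [d for d in detections if d.closing]
    if not threats:
        return "cruise"
    # Decide: react to the nearest threat, whichever sensor saw it
    nearest = min(threats, key=lambda d: d.distance_m)
    # Act: escalate from speed reduction to braking as range closes
    if nearest.distance_m < 10:
        return "brake"
    return "slow_down"
```

In a real vehicle this loop runs continuously in hard real time, which is why the article stresses that perception, decision, and actuation "must now act together."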
Pisz stressed that Toyota believes that autonomous cars will need to "talk" to each other and to the road around them using vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) systems. The company recently opened an 8.6-acre (3.5-ha) urban-streetscape proving ground in Japan that allows simulation of real-life traffic situations with other vehicles, pedestrians, and control devices.
“It’s unlikely that states and local governments will be [able to] afford beacons on every street corner,” he said. “So here it’ll be up to the car manufacturers to make sure that their cars can talk to one another” via short-range wireless links.
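To make the V2V idea concrete, a car-to-car broadcast boils down to each vehicle periodically packing its state into a short message and decoding the messages of its neighbors. The sketch below is purely hypothetical—real V2V uses a compact standardized binary format (SAE J2735's BasicSafetyMessage over dedicated short-range radio), not JSON, and the field names here are invented.

```python
# Hypothetical V2V "safety message" exchange. JSON and these field names
# are illustrative stand-ins for the standardized binary message format.

import json

def make_safety_message(vehicle_id: str, lat: float, lon: float,
                        speed_kmh: float, heading_deg: float) -> bytes:
    """Pack one vehicle's state into a broadcastable message."""
    return json.dumps({
        "id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_kmh": speed_kmh,
        "heading_deg": heading_deg,
    }).encode()

def parse_safety_message(raw: bytes) -> dict:
    """Decode a received broadcast back into vehicle state."""
    return json.loads(raw.decode())
```

The interoperability problem Pisz raises lives entirely in this layer: every manufacturer's cars must agree on the message format before any of them can usefully listen.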
Pisz warned that the car industry must be careful not to move too quickly on autonomous driving because of the critical nature of any adoption process: “Taking your hands off the wheel and feet off the pedals presents a different kind of dynamic, a feeling that’s both joyful and scary at the same time.”