Audi details piloted driving technology

  • 01-Apr-2015 11:29 EDT

Audi plans to reduce the hefty electronic system used in its piloted vehicle to a single board.

Before autonomous vehicles make drivers obsolete, electronic technologies will depend on people to make decisions when something unusual happens. During normal driving conditions, autonomous controls could pilot the vehicle, relying on humans when complex decisions are required.

Audi recently provided technical insight into its piloted vehicle project, in which an Audi A7 concept car drove from San Francisco to Las Vegas earlier this year. The vehicle drove itself for most of the journey, though the driver had to remain ready to take over whenever the system prompted them to resume driving.

The concept car has a range of computers in the trunk. Audi engineers plan to reduce them to a single board over time. The mainstays of the piloted vehicle technologies are an array of cameras, radar, and ultrasonic sensors that are controlled by what’s called the zFAS board. It combines sensor inputs to give the car its view of the world.

“All raw signals from the sensors [are] collected in a sensor fusion box,” said Matthias Rudolph, Head of Architecture Driver Assistance Systems at Audi AG, during the recent Nvidia GPU Technology Conference. “From that input, a virtual environment is created.”

Four semiconductors are the basis of the zFAS board. An Nvidia Tegra K1 processor collects data from four cameras and “does everything while driving at low speeds,” Rudolph said. An Infineon Aurix processor handles additional chores. Mobileye’s EyeQ3 performs vision processing, while an Altera Cyclone FPGA (field programmable gate array) performs sensor fusion.

The software architecture is layered, with the perception sensor programs forming the first layer. Above that, there’s a fusion layer that blends data from the sensors with information from maps, road graphs, and other sources. Rudolph noted that combining inputs provides better information and increases confidence in the analysis.

“Radar is not good at determining the width of a car,” Rudolph said. “A camera does that well. If we fuse data from each of them we get good information on what’s ahead.”
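The complementary-strengths fusion Rudolph describes can be sketched as an inverse-variance weighted average: each sensor's estimate is weighted by its confidence, so the fused range leans on the radar and the fused width leans on the camera. This is a minimal illustration with hypothetical numbers, not Audi's actual zFAS algorithm.

```python
# Minimal sensor-fusion sketch (hypothetical values, not Audi's zFAS code).
# Each measurement is a (value, variance) pair; a low variance means the
# sensor is trusted more for that quantity.

def fuse(measurements):
    """Inverse-variance weighted average of redundant measurements."""
    num = sum(value / var for value, var in measurements)
    den = sum(1.0 / var for value, var in measurements)
    return num / den

# Radar: precise range, poor width. Camera: good width, coarser range.
radar = {"range_m": (42.0, 0.1), "width_m": (2.5, 1.0)}
camera = {"range_m": (44.0, 2.0), "width_m": (1.8, 0.05)}

fused = {key: fuse([radar[key], camera[key]]) for key in radar}
```

With these example variances, the fused range ends up close to the radar's 42 m and the fused width close to the camera's 1.8 m, matching the intuition in the quote above.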

Ensuring that the zFAS boards detect potential threats and respond to them correctly without false alerts is critical. If vehicles stop or swerve to avoid something that isn’t a true danger, drivers are likely to stop using the system.

“If the car brakes and nothing’s there, it will destroy the confidence of the driver,” Rudolph said. “We have had no false positives; that’s been proven with over 10,000 hours of driving at an average speed of 60 kph (37 mph) in situations including snow and freezing rain.”

Audi looks at moving objects to analyze their potential impact given the vehicle’s driving path and speed. All stationary items are viewed with a single goal.

“We look at static images as the same,” Rudolph said. “It doesn’t matter if it’s a wall or a parked car, we don’t want to hit it.”

Pedestrians are a major challenge for all types of autonomous systems. They’re harder to spot and categorize than vehicles, and they have more degrees of freedom. The system uses a single monocular camera to search for pedestrians. Given the erratic behavior of some pedestrians, Audi doesn’t stop for them unless they’re truly in harm’s way.

“When we detect pedestrians, we compute the time to contact,” Rudolph said. “We’re close when the vehicle stops. We want to be close, just a few centimeters away. We do not want to stop far away.”
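A time-to-contact check of the kind Rudolph mentions can be sketched in a few lines: divide the remaining distance by the closing speed, and brake at the last moment that still leaves a small margin. The threshold, deceleration, and margin below are illustrative assumptions; the article does not disclose Audi's actual values or equations.

```python
# Hedged sketch of a time-to-contact check (illustrative numbers only).

def time_to_contact(distance_m, closing_speed_mps):
    """Seconds until contact at the current closing speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing on the pedestrian
    return distance_m / closing_speed_mps

def should_brake(distance_m, closing_speed_mps, decel_mps2=6.0, margin_m=0.1):
    """Brake at the last moment that still stops a short margin away."""
    stopping_dist = closing_speed_mps ** 2 / (2 * decel_mps2)  # v^2 / 2a
    return stopping_dist + margin_m >= distance_m
```

For example, at 10 m/s with 6 m/s² of braking, the stopping distance is about 8.3 m, so braking would begin only once the pedestrian is roughly 8.4 m ahead, stopping close rather than far away, as the quote describes.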

Though the piloted system aims to avoid pedestrians and most everything else, Audi realizes that collisions can’t always be prevented.

“If we can’t avoid an accident, we steer to use the structure of the car to minimize the chance of injury,” Rudolph said.

Such an action would occur mainly when the human driver didn’t take over in time to avoid a collision. Audi uses an LED alert system to tell drivers when they need to take charge. They can do that by hitting the brakes or making a sharp steering wheel movement. An interior-facing camera watches drivers so the system knows whether the LED alert needs to be augmented with an audible warning.
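The escalation logic described here can be sketched as a simple decision: the LED prompt comes first, and an audible warning is added only when the driver camera indicates the driver isn't paying attention. The state names and the two-second timeout are assumptions for illustration, not Audi's implementation.

```python
# Illustrative sketch of takeover-alert escalation (timings are assumed,
# not Audi's actual parameters).

def choose_alert(driver_attentive, seconds_since_led):
    """Escalate from LED to LED + audible when the driver isn't responding."""
    if driver_attentive:
        return "led"              # visual prompt is enough
    if seconds_since_led > 2.0:
        return "led+audible"      # driver distracted: add sound
    return "led"
```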

“In the piloted driving mode, we may need to get the driver back, so we need to know what he’s doing,” Rudolph said.
