Audi details piloted driving technology

  • 01-Apr-2015 11:29 EDT

Audi plans to reduce the hefty electronic system used in its piloted vehicle to a single board.

Before autonomous vehicles make drivers obsolete, electronic technologies will depend on people to make decisions when something unusual happens. During normal driving conditions, autonomous controls could pilot the vehicle, relying on humans when complex decisions are required.

Audi recently provided technical insight into its piloted vehicle project, in which an Audi A7 concept car drove from San Francisco to Las Vegas earlier this year. The vehicle drove itself for most of the journey, though drivers had to remain ready to take over whenever the system prompted them to resume control.

The concept car has a range of computers in the trunk. Audi engineers plan to reduce them to a single board over time. The mainstays of the piloted vehicle technologies are an array of cameras, radar, and ultrasonic sensors that are controlled by what’s called the zFAS board. It combines sensor inputs to give the car its view of the world.

“All raw signals from the sensors are collected in a sensor fusion box,” Matthias Rudolph, Head of Architecture Driver Assistance Systems at Audi AG, said during the recent Nvidia GPU Technology Conference. “From that input, a virtual environment is created.”

Four semiconductors are the basis of the zFAS board. An Nvidia Tegra K1 processor collects data from four cameras and “does everything while driving at low speeds,” Rudolph said. An Infineon Aurix processor handles additional chores. Mobileye’s EyeQ3 performs vision processing, while an Altera Cyclone FPGA (field-programmable gate array) performs sensor fusion.

The software architecture is layered, with the perception sensor programs forming the first layer. Above that, there’s a fusion layer that blends data from the sensors with information from maps, road graphs, and other sources. Rudolph noted that combining inputs provides better information and increases confidence in the analysis.

“Radar is not good at determining the width of a car,” Rudolph said. “A camera does that well. If we fuse data from each of them we get good information on what’s ahead.”
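The complementary-strengths idea Rudolph describes can be illustrated with a minimal fusion sketch. This is my own simplification, not Audi's zFAS code: each sensor reports a per-attribute estimate with a variance, and the fused value weights each estimate by the inverse of its variance, so the more certain sensor dominates.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of (value, variance) pairs.

    Returns the fused value and the (smaller) fused variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Radar ranges precisely but judges object width poorly; a camera is the
# reverse. Illustrative numbers only:
radar_width = (2.4, 1.00)    # width in metres, high variance
camera_width = (1.8, 0.04)   # width in metres, low variance

width, var = fuse([radar_width, camera_width])
# The fused width sits close to the camera's estimate, with lower
# uncertainty than either sensor alone.
```

The same weighting applied to range measurements would instead favor the radar, which is the sense in which fusion "provides better information and increases confidence."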

Ensuring that the zFAS boards detect potential threats and respond to them correctly without false alerts is critical. If vehicles stop or swerve to avoid something that isn’t a true danger, drivers are likely to stop using the system.

“If the car brakes and nothing’s there, it will destroy the confidence of the driver,” Rudolph said. “We have had no false positives; that’s been proven with over 10,000 hours of driving at an average speed of 60 kph (37 mph) in situations including snow and freezing rain.”

Audi looks at moving objects to analyze their potential impact given the vehicle’s driving path and speed. All stationary items are viewed with a single goal.

“We look at static images as the same,” Rudolph said. “It doesn’t matter if it’s a wall or a parked car, we don’t want to hit it.”
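Rudolph's point that a wall and a parked car get identical treatment reduces the static-obstacle problem to a single geometric question: does the object lie in the vehicle's planned path? A hedged sketch of that check, using a straight-ahead corridor as a stand-in for the real planned trajectory (my assumption, not Audi's planner):

```python
def blocks_corridor(obstacle_xy, corridor_half_width, lookahead):
    """True if a static obstacle falls inside the driving corridor.

    obstacle_xy: (x, y) in metres, x = longitudinal distance ahead,
    y = lateral offset from the vehicle centreline. The obstacle's
    type is deliberately irrelevant -- wall or parked car alike.
    """
    x, y = obstacle_xy
    return 0.0 <= x <= lookahead and abs(y) <= corridor_half_width

# A parked car 20 m ahead and 0.5 m off-centre blocks a 1.2 m half-width
# corridor; the same object 3 m to the side does not.
in_path = blocks_corridor((20.0, 0.5), 1.2, 60.0)
off_path = blocks_corridor((20.0, 3.0), 1.2, 60.0)
```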

Pedestrians are a major challenge for all types of autonomous systems. They’re harder to spot and categorize than vehicles, and they have more degrees of freedom. The system uses a single monocular camera to search for pedestrians. Given the erratic behavior of some walkers, Audi doesn’t stop for pedestrians unless they’re truly in harm’s way.

“When we detect pedestrians, we compute the time to contact,” Rudolph said. “We’re close when the vehicle stops. We want to be close, just a few centimeters away. We do not want to stop far away.”
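The article confirms Audi computes a time to contact (TTC) for detected pedestrians; the formula and threshold below are the textbook constant-closing-speed version, shown as an assumption rather than Audi's implementation. Braking only when TTC drops below a small threshold is what lets the car stop "just a few centimeters away" instead of far off.

```python
def time_to_contact(distance_m, closing_speed_ms):
    """Seconds until contact; infinite if the gap is static or opening."""
    if closing_speed_ms <= 0.0:
        return float("inf")
    return distance_m / closing_speed_ms

def should_brake(distance_m, closing_speed_ms, ttc_threshold_s=1.5):
    # Ignore pedestrians who are not truly in harm's way: act only
    # when contact is imminent. Threshold value is illustrative.
    return time_to_contact(distance_m, closing_speed_ms) < ttc_threshold_s
```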

Though the piloted system aims to avoid pedestrians and most everything else, Audi realizes that collisions can’t always be prevented.

“If we can’t avoid an accident, we steer to use the structure of the car to minimize the chance of injury,” Rudolph said.

Such an action would occur mainly when the human driver didn’t take over in time to avoid a collision. Audi uses an LED alert system to tell drivers when they need to take charge. They can do that by hitting the brakes or making a sharp steering-wheel movement. An inward-facing camera watches drivers so the system knows whether the LED alert needs to be augmented with an audible warning.
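The takeover handshake described above can be sketched in two small decisions: which alert channels to fire (escalating from the LED to an audible warning when the driver-monitoring camera reports inattention), and what driver input counts as resuming control. The threshold and channel names are hypothetical, not Audi's specification:

```python
def takeover_alerts(driver_attentive):
    """Alert channels to activate for a takeover request."""
    alerts = {"led"}
    if not driver_attentive:
        # Camera says the driver isn't watching the road: escalate.
        alerts.add("audible")
    return alerts

def driver_has_taken_over(brake_pressed, steering_rate_deg_s,
                          steering_threshold_deg_s=15.0):
    # Hitting the brakes or a sharp steering movement hands control back.
    return brake_pressed or abs(steering_rate_deg_s) > steering_threshold_deg_s
```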

“In the piloted driving mode, we may need to get the driver back, so we need to know what he’s doing,” Rudolph said.
