Ford and University of Michigan unveil automated-vehicle testbed

  • 19-Dec-2013 04:53 EST

Ford's Fusion Hybrid testbed isn't as cluttered with rooftop appendages as the typical DARPA Challenge vehicle or Google's autonomous car testbed. Note the four-sensor Velodyne LiDAR array on the roof. (Lindsay Brooke)

Ford recently unveiled a heavily modified Fusion Hybrid that is serving as a testbed for the automaker’s automated-vehicle research. Ford is conducting the program with the University of Michigan, which has the algorithm-development lead, and State Farm Insurance, which is analyzing how driver-assist technologies can lower the rate of rear collisions.

The goal of the program, in which Ford has the lead on systems and vehicle integration, is to advance development of new sensor and control technologies with the aim of applying them to the active safety systems of future vehicles, said Raj Nair, Ford Group Vice President of Global Product Development. Nair stressed that the automaker’s current focus is on automated vehicles—in which the driver is always in control—rather than autonomous self-driving types.

The Fusion Hybrid testbed features a quartet of roof-mounted LiDAR (Light Detection And Ranging) infrared sensors that scan the surroundings. LiDAR measures distance by using a laser to illuminate everything within 200 ft (61 m) of the vehicle and then analyzing the light that is reflected back, similar in principle to radar. The reflected light is combined with GPS data to generate a real-time, high-resolution 3-D map of the area surrounding the vehicle, which is used to help guide it.
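The ranging principle described above is simple time-of-flight arithmetic: a pulse travels out and back, so distance is half the round trip at the speed of light. A minimal sketch (illustrative only, not Ford or Velodyne code):

```python
# Sketch of LiDAR time-of-flight ranging (illustrative, not Ford/Velodyne code).
# A pulse travels to the target and back, so distance = c * t / 2.

C = 299_792_458.0  # speed of light, m/s

def round_trip_time_s(dist_m: float) -> float:
    """Round-trip time for a LiDAR pulse to a target dist_m away."""
    return 2.0 * dist_m / C

def distance_m(round_trip_s: float) -> float:
    """Distance recovered from a measured round-trip time."""
    return C * round_trip_s / 2.0

# At the article's 200 ft (61 m) limit, the echo returns in roughly 400 ns,
# which is why millions of returns per second are feasible.
print(f"round trip at 61 m: {round_trip_time_s(61.0) * 1e9:.0f} ns")
```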

The rooftop sensor array resembles spinning coffee cans. It is sourced from Velodyne, a major LiDAR hardware supplier. The sensors can track any stationary or moving object that is sufficiently dense to reflect light. The testbed vehicle currently can process data at the rate of approximately 2.8 million bytes per second, said Ed Olson, a University of Michigan professor involved with the program, who co-founded MIT’s Mobile Autonomous Systems Laboratory.

Olson noted that the biggest current challenge is range: “closing rates on the highway can be in excess of 140 mph,” he explained. Because of this, the LiDAR would likely be used in conjunction with radar for distances beyond 200 ft (61 m).
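Olson's concern is easy to quantify. A back-of-envelope check (illustrative only) of how long a 200-ft LiDAR return leaves at a 140-mph closing rate:

```python
# Back-of-envelope check of the range concern above (illustrative only):
# at a 140 mph closing rate, how long until a target first seen at the
# 200 ft LiDAR limit reaches the vehicle?

MPH_TO_MS = 0.44704  # miles per hour -> meters per second
FT_TO_M = 0.3048     # feet -> meters

closing_speed_ms = 140 * MPH_TO_MS  # ~62.6 m/s
lidar_range_m = 200 * FT_TO_M       # ~61 m

time_to_contact_s = lidar_range_m / closing_speed_ms
print(f"time to contact: {time_to_contact_s:.2f} s")
```

The answer is just under one second, which is why a longer-range radar would back up the LiDAR on the highway.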

Radar and LiDAR operate in different spectra. Radar is in the millimeter-wavelength range (e.g., the 77-GHz radar used in adaptive cruise control systems has a wavelength of about 4 mm), while LiDAR operates near the visible spectrum at much shorter wavelengths, typically from about 250 nm (10 µin) to 10 µm (390 µin). Experts note that the short wavelengths are preferred for mapping because they resolve more granular detail. ACC, by comparison, needs only range and range-rate.
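The figures quoted above follow directly from λ = c/f. A quick check (illustrative only; the 905-nm example is an assumption, as a wavelength commonly used in automotive LiDAR, not a figure from the Ford program):

```python
# Wavelength check for the figures quoted above: lambda = c / f (illustrative only).

C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

# 77 GHz automotive radar -> ~3.9 mm, i.e. "about 4 mm"
print(f"77 GHz radar: {wavelength_m(77e9) * 1e3:.2f} mm")

# 905 nm near-IR, a wavelength commonly used in automotive LiDAR (assumption,
# not from the article), corresponds to a frequency of ~331 THz
print(f"905 nm LiDAR frequency: {C / 905e-9 / 1e12:.0f} THz")
```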

There is a critical difference: LiDAR images degrade in poor weather. “Rain or snow can put LiDAR out of business. That is why very early ACCs, such as those used by Chrysler, started out with LiDAR but quickly adopted microwave radar because the latter can 'see' through inclement weather,” explained Gerald Conover, Managing Director at PRC Associates, a technology consultancy.

In the current development phase, vehicle-to-vehicle and vehicle-to-infrastructure (V2V and V2I, respectively) capabilities are not included because “they’re not critical for the vehicle to operate safely,” said Dr. Ryan Eustice, Associate Professor at the University of Michigan’s School of Naval Architecture & Marine Engineering, an expert in imaging technologies who has been involved with the Ford automated-vehicle program since its 2007 inception.

Dr. Eustice said the range accuracy of the current system is ±2 cm (0.8 in), governed by the LiDAR spin rate (currently 10 Hz). He said the hardware’s size and form factor are steadily evolving toward more compact units. Satellite imaging could potentially be incorporated in future vehicle configurations, he added.

Ford, which is responsible for developing unique components allowing the vehicle to function at high levels of automation, already has technology that enables production vehicles to park themselves, understand a driver's voice commands, detect dangerous driving situations, and assist with emergency braking.

Competitors Mercedes and Mitsubishi have shown ranging devices based on binocular cameras operating in the visible spectrum, like human eyes. Mitsubishi’s system is currently available on production cars.


