Many autonomous-driving development plans call for deploying a handful of solid-state Lidar sensors on each vehicle, but the Lidar modules used on today’s prototype vehicles are all mechanical systems with moving parts. That mismatch has prompted huge interest in solid-state Lidar, with OEMs and suppliers racing to invest in non-mechanical technologies.
Several small companies have developed solid-state Lidar technologies that aren’t yet ready for automotive applications—and some of them have been gobbled up by major automotive companies over the past 18 months. Ford made a large investment in Velodyne, while ZF bought a 40% stake in Ibeo. Continental acquired Advanced Scientific Concepts, and Analog Devices Inc. (ADI) acquired Vescent Photonics Inc.
The interest stems from Lidar’s use of emitted laser light to measure the distance to objects, functioning much like radar. The laser lets the system provide high-resolution imagery at night and in rain or snow.
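At its core, Lidar ranging rests on a simple time-of-flight calculation: a laser pulse travels to the target and back, and half the round-trip time multiplied by the speed of light gives the distance. A minimal sketch of that principle (my own illustration, not any supplier’s code):

```python
# Illustrative sketch of Lidar time-of-flight ranging (not vendor code).
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a target, given the laser pulse's round-trip time."""
    return C * t_seconds / 2.0

# A pulse returning after roughly 1.33 microseconds corresponds to
# a target at roughly 200 m, the range goal discussed below.
print(range_from_round_trip(1.334e-6))
```

Real modules repeat this measurement across thousands of points per frame to build a 3D point cloud.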
“High-resolution 'flash' Lidar is a necessary technology for autonomous driving because its capabilities are available in all lighting and weather conditions,” said Dean McConnell, Director of Customer Programs, Advanced Driver Assistance Systems, at Continental North America. “We’re capturing images at 30 Hz, constructing 3D point clouds 30 times per second.”
The technology also helps safety systems zero in on objects of interest. That’s important for determining whether an object is a threat to driving.
“Lidar acts more like the human eye: it views a broad scene, doing a quick scan, then if it sees something interesting, it can focus in on that,” said Chris Jacobs, General Manager of Automotive Safety for ADI.
Lidar providers currently are racing to develop compact solid-state modules because the large mechanical pucks now used by autonomous-driving researchers are too bulky and costly to go into production vehicles. Researchers are striving to shrink sizes and come up with a good combination of distance and field of view.
“Our solid-state box measures 9 x 6 x 6 cm, about the size of two decks of cards,” said Louay Eldada, Quanergy's CEO. “Currently, it has a 120-degree field of view, so with three you have 360-degree coverage. There will always be two in the front, on the right and left sides, and one in the back middle or one on each corner.”
Detection range, a key parameter for safety, can be increased by narrowing the field of view. Developers are trying to match the range of cameras and radar, with a goal of around 200 m (656 ft), and several tradeoffs are being weighed to get there. Mounting location also helps determine field-of-view requirements; modules looking to the sides, for example, won’t need the same range capability as forward-facing units, so their field of view can be wider.
“We’ve demonstrated 70 meters (230 ft) with a 15-degree field of view, which is clearly not sufficient,” said Aaron Jefferson, Director of Product Planning for ZF’s Active and Passive Safety Division. “It needs to go up to 50 or 60 degrees to start. When the cost comes down, it’s conceivable that they could be integrated into taillights and headlights.”
Lidar will complement cameras and radar, providing information that typically will be "fused" with that from other sensors to create a reliable image of vehicle surroundings. All these sensors generate a huge amount of data, making communications and data management an important factor in overall designs.
“3D Lidar sensing will create a significant amount of data, but similar to radar and camera, there are software techniques to help minimize the amount of data, eliminate useless or unimportant data and extract the detail from the data of concern,” Jefferson said. “Furthermore, the techniques used to filter data, group/cluster data, identify objects, etc. also determine the amount of data that needs to be processed, which is the real concern in terms of managing data volume.”
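One common data-reduction technique of the kind Jefferson alludes to is voxel-grid downsampling, which replaces all points falling in the same cubic cell with their centroid. A minimal sketch (my own, not ZF’s pipeline):

```python
# Voxel-grid downsampling of a Lidar point cloud -- an illustrative
# sketch of one data-reduction technique, not any supplier's software.
from collections import defaultdict

def voxel_downsample(points, voxel_size=0.2):
    """points: iterable of (x, y, z) tuples in meters.
    Returns one centroid point per occupied voxel of side voxel_size."""
    cells = defaultdict(list)
    for p in points:
        # Bucket each point by its voxel index.
        key = tuple(int(c // voxel_size) for c in p)
        cells[key].append(p)
    # Replace each voxel's points with their centroid.
    return [tuple(sum(axis) / len(pts) for axis in zip(*pts))
            for pts in cells.values()]

cloud = [(0.01, 0.02, 0.0), (0.03, 0.01, 0.0), (5.0, 5.0, 1.0)]
# The two nearby points collapse into a single voxel centroid.
print(len(voxel_downsample(cloud)))
```

Production pipelines layer clustering, ground-plane removal and object classification on top of reductions like this, which is why the filtering choices, not the raw sensor rate, dominate the processing load.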
Curiously, no real hurry
Though there’s plenty of development, the market isn’t expected to see much activity for some time. Many engineers say Lidar can develop slowly while waiting for autonomous vehicle designs to solidify. For now, system designers can create prototypes using mechanical components while they wait for next-generation modules.
“Solid-state Lidar will be in production later this year, but for pilots and software development, you don’t need solid-state,” Eldada said. “Though we plan to ship solid-state products in September, we won’t have automotive-grade parts ready until a year later.”
The rollout of Lidar-equipped vehicles is as murky as the emergence of autonomous cars. Corporate fleet programs like Uber’s current autonomous tests in Pittsburgh may expand into market opportunities before mainstream OEMs start ordering Lidar sensors.
“We’re looking at series production in the 2021 timeframe, but it may happen faster in different segments,” McConnell said. “Some fleet-service companies are aggressive about getting vehicles out with automated driving in a geomapped area.”
Once Lidar is in use, many developers don’t expect it to displace many other sensors. A range of technologies is needed to provide the capability and redundancy needed to drive autonomously in all weather conditions.
“We do not see 3D Lidar as a sensor replacement, but rather as an innovation that can enable the high-resolution sensing needed to realize SAE Level 4-plus automated driving,” Jefferson said. “3D solid-state Lidar, camera, radar, ultrasonic sensing and other technologies will continue to play a role—a combination of these will be necessary to properly sense the vehicle environment in 360 degrees, in real time.”
That’s not a universal conclusion, however.
“Ultrasonics will go away,” Eldada countered. “Video is needed for color, things like seeing traffic lights. Fusing Lidar and cameras 'colorizes' our data so it’s more valuable. Radar is needed for redundancy; you need another sensor before deciding to steer or hit the brakes.”