Intel inside the cloud-based data future

  • 05-Dec-2016 06:17 EST

Kathy Winters listens to a reporter's question at a technology event.  Lindsay Brooke photo

Kathy Winters has been re-introducing herself in auto industry circles lately. Last summer, the well-known Delphi engineering executive joined Intel as General Manager of the Automated Driving Solutions Division—the chip maker's fast-growing group dedicated to advanced driver assistance systems (ADAS) and automated-vehicle technology. Soon after, Automotive Engineering and Winters re-connected for a discussion about automated driving's need for intense data-processing capability.

With the increasing use of processors in the vehicle and the need for more processing power, is Intel now officially a Tier 1 in some cases?

We’re still a Tier 2 but kind of a ‘hybrid’—we’re actually a Tier 2 development partner because we’re helping to architect the system. If you look at our engagement with BMW for example, we’re very ‘hands-on,’ working more directly with them and (machine-vision specialist) Mobileye. There will always be a Tier 1 in that mix, I would say, doing the integration. So we’re bridging a gap.

The biggest opportunity for us is to do both the in-vehicle piece—which requires heavy data processing, filtering and transmission for the ADAS sensor fusion—and to understand the 5G connection in the vehicle and then really put in that ‘data center’ with the learnings and apps that have to be there.

The data center will be understanding where the cars around you are, sending out traffic information and helping to optimize road use at the best times.

The ‘data center’ will be in the cloud?

Correct. There is an enormous amount of in-vehicle data that’s coming off the radar, vision systems, LiDAR and from the V2V connection—from other vehicles. The vehicle when it’s in automated mode needs to react and path-plan, executing split-second decision-making. That has to come from the cloud and it involves communicating with other vehicles, all with minimal latency. We need a super-fast computing cloud, so we’ll need 5G: we need the ‘pipeline’ and the speed.

The data center isn’t an off-the-shelf technology solution yet, but do you expect it to be?

Nothing’s off-the-shelf in this space yet. We want to get to a really well-defined platform, potentially something that could be (widely) used and not require everybody in the industry to have to reinvent their own.

The subject of latency and speed to and from the cloud is getting a lot of discussion among ADAS engineers.

Decision processing may have to happen in less than a millisecond. The vehicle will also need the ability to upgrade and upload new software. Over-the-air (OTA) is driving a lot of memory and storage. Do you keep the old load and the new load in parallel, or wait to completely flash out a new engine controller? Now that OTA is a reality, how do you partition it, and how much storage and memory do you need onboard to make the appropriate upgrades?
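The "old load and new load in parallel" question Winters raises is essentially the A/B-slot scheme used in many OTA systems: two copies of the firmware live side by side, trading doubled storage for a safe fallback. A minimal sketch of that idea (the slot names and methods here are illustrative, not any particular supplier's implementation):

```python
# Minimal A/B-slot OTA sketch: the new image is written to the inactive
# slot while the vehicle keeps running the old one, so a failed update
# can simply boot the previous slot again.

class AbSlotUpdater:
    def __init__(self):
        self.slots = {"A": "v1.0", "B": None}  # firmware image per slot
        self.active = "A"                      # slot currently booted

    def inactive(self):
        return "B" if self.active == "A" else "A"

    def stage(self, new_image):
        """Write the new load into the inactive slot; the running image is untouched."""
        self.slots[self.inactive()] = new_image

    def activate(self):
        """Point the next boot at the freshly staged slot."""
        if self.slots[self.inactive()] is None:
            raise RuntimeError("no staged image")
        self.active = self.inactive()

    def rollback(self):
        """If the new image misbehaves, fall back to the previous slot."""
        self.active = self.inactive()

updater = AbSlotUpdater()
updater.stage("v1.1")    # download in the background while running v1.0
updater.activate()       # next boot runs v1.1 from slot B
print(updater.active, updater.slots[updater.active])  # → B v1.1
```

The design choice is exactly the storage-versus-safety trade-off in the quote: keeping both loads resident doubles the flash requirement, while flashing a single image in place saves memory but leaves the controller unbootable if the update is interrupted.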

What’s your view of the SAE Level 3 and driver re-engagement debate—when and how the automated car hands back control to the human?

Today Intel and others are focused on Levels 2, 3, 4 because active safety is and will continue to be really big business. But some folks are being more disruptive and going straight to the end game of full autonomy. The mobility-on-demand guys want to get the human driver out—that’s what their business case is focused on. They’re the ones driving the pace to Level 4 faster than the folks who are incrementally ratcheting up safety on the way to Level 4. I see a place for both.

This revolution won’t happen all at once.

There will be a very interesting dynamic on our roads during the next 10 years, with traditional car owners driving fully manually in traffic with Level 2 and 3 vehicles as well as robo-fleet vehicles. It will take years to push those older Level 0 vehicles out of the car parc. But all the ‘Smart City’ initiatives are really going to drive more and better data and use cases for our development going forward.
