
The Day The Machines Took Over (The Wheel)

Intel Powers Audi A8 to Level 3 Automation

We’ve all been waiting for the day when the machines take over the world. In fact, if you read this publication, you’re almost certainly spending most of your working day (even the part suffering through meetings) enabling the machines to take over the world. After all, that’s the whole purpose of technology – to build machines that make our lives easier by automating tasks for humans. Of course, sometimes that means we end up handing over control of jobs that we enjoy, or that we perform to earn a living, or that have a potentially dramatic impact on our safety. Autonomous driving is all of those things.

Over 75 years ago, in 1942, in his short story “Runaround,” Isaac Asimov famously penned his “Three Laws of Robotics.” The Three Laws enumerated the responsibilities of robots in priority order. One could summarize them as, “Hey robots, please don’t hurt us.” In January 2014, SAE International (formerly the Society of Automotive Engineers) issued J3016, which defined six levels (numbered 0-5) of driving automation for on-road vehicles. These too could be summarized as, “Hey robots, please don’t hurt us.”

In their quest to take over the day-to-day driving while mostly adhering to the “Please don’t hurt us” directive, robots (like teenagers) will be given driving responsibilities gradually, phasing in more independence and autonomy as they prove themselves worthy. Unlike most teenagers, however, the first production car the robots will get to drive is an Audi A8 (eat your hearts out, kids). The 2018 A8 will be the first production car to reach J3016 Level 3, which is the first level that qualifies as an “automated driving system.” Yep, the robots are getting their learners’ permits.

Reviewing (and paraphrasing) the levels quickly (a code sketch of the taxonomy follows the list):

Level 0 – no automation. You’re on your own, bub.
Level 1 – in specific mode(s) (or situations) the system and the human driver are both responsible for steering and acceleration/deceleration.
Level 2 – in specific mode(s) (or situations), the system is responsible for steering and acceleration/deceleration, with the human driver in charge of monitoring the driving environment.
Level 3 – in specific mode(s) (or situations), the system is responsible for steering, acceleration/deceleration, and monitoring the driving environment. The human driver’s attention is not required, but the human driver must be available to take over when requested by the system.
Level 4 – in specific mode(s) (or situations), the system is responsible for all driving tasks, even if the human driver does not respond to a request to intervene.
Level 5 – in all situations, the system is responsible for all driving tasks.
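
For the programmers in the audience, the taxonomy maps naturally onto an enumeration. Here’s a minimal, unofficial Python sketch (the level names follow J3016’s informal labels; the helper functions are my own shorthand for the monitoring/fallback split, not anything from the standard):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """J3016 driving-automation levels, paraphrased (not the normative text)."""
    NO_AUTOMATION          = 0  # human does everything
    DRIVER_ASSISTANCE      = 1  # system shares steering or speed control
    PARTIAL_AUTOMATION     = 2  # system steers and controls speed; human monitors
    CONDITIONAL_AUTOMATION = 3  # system monitors the road; human is on-call fallback
    HIGH_AUTOMATION        = 4  # system handles everything within its design domain
    FULL_AUTOMATION        = 5  # system handles everything, everywhere

def human_must_monitor(level: SAELevel) -> bool:
    """Through Level 2, the human driver monitors the driving environment."""
    return level <= SAELevel.PARTIAL_AUTOMATION

def human_is_fallback(level: SAELevel) -> bool:
    """At Level 3, the human may look away but must be ready to take over."""
    return level == SAELevel.CONDITIONAL_AUTOMATION
```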

Autonomous driving isn’t just technology for technology’s sake, or an indulgent science fair experiment for ambitious engineers. On US roads alone, 30,000 people die each year, and it is estimated that technology could reduce those deaths by as much as 90%. And safety is just the tip of the iceberg in terms of economic benefits. But reducing 30,000 deaths by 90% still leaves 3,000 annual fatalities, and rather than blaming the (often deceased) human driver for those fatalities, the families and friends of the deceased will often be able to find fault with the companies (and the engineers) who designed and built the automated driving systems. Yep, if these predictions are accurate, we engineers will kill an estimated 3,000 people per year as a result of our own human failings – in order to save an estimated 27,000 people (who will ironically never know they’ve been saved) from falling victim to the human failings of others. It’s a hefty responsibility.
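
The arithmetic behind those numbers is trivial, but worth making explicit (a quick sketch using the article’s estimates, not measured data):

```python
# Back-of-envelope math on the article's estimates (not measured data).
annual_us_road_deaths = 30_000
estimated_reduction   = 0.90   # claimed potential reduction from automation

remaining_deaths = annual_us_road_deaths * (1 - estimated_reduction)
lives_saved      = annual_us_road_deaths * estimated_reduction

print(f"Annual fatalities remaining: {remaining_deaths:,.0f}")  # 3,000
print(f"Annual lives saved:          {lives_saved:,.0f}")       # 27,000
```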

With that kind of weight on their shoulders, it makes sense that SAE came out with a standard taxonomy and well-defined levels of automation in order to ease us engineers into that peril-laden future as gently and smoothly as possible. J3016 defines six levels from “no automation” to “full automation” for passing the keys to the car over to robots. Levels 0-2 are “driver assistance” where a human driver is charged with monitoring the driving environment. Upon reaching level 3, “Conditional Automation,” however, the responsibility for monitoring the driving environment moves to the machine, with the human driver acting only as “fallback” for when the automated system finds itself unable to handle the situation.

Audi claims that the 2018 A8 will be the first production car to achieve J3016 Level 3 automation. The “Traffic Jam Pilot” mode will take over driving under a specific set of conditions – namely, in heavy highway traffic at speeds below 37 MPH (60 km/h), when a physical barrier separates the traffic lane from oncoming traffic. In that situation (unlike with Tesla’s Level 2 system), the human driver gets the OK to stop paying attention. Yup. If you can cross that emotional barrier, and if local laws allow, you have permission to mess with your phone, watch the onboard TV, or focus all your emotional energy on suppressing road rage. The system, in fact, will be monitoring YOU in order to be sure you’re available to take over if and when the robot is unable to continue its duties safely.
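
In code terms, the mode’s operational envelope amounts to a conjunction of conditions like the following. To be clear, this is a hypothetical sketch: the condition names and the driver-availability check are my own illustration, not Audi’s actual criteria or interfaces.

```python
TJP_MAX_SPEED_MPH = 37  # roughly 60 km/h

def traffic_jam_pilot_available(speed_mph: float,
                                divided_highway: bool,
                                dense_traffic: bool,
                                driver_available: bool) -> bool:
    """Illustrative only: a Level 3 mode engages inside a narrow envelope."""
    return (speed_mph < TJP_MAX_SPEED_MPH
            and divided_highway    # physical barrier from oncoming traffic
            and dense_traffic      # heavy, slow-moving highway traffic
            and driver_available)  # driver monitoring confirms a fallback human
```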

The Audi A8 gives some insight into Intel’s strategy for its Programmable Solutions Group (PSG – formerly Altera). The A8 will use Intel FPGAs as part of the autonomous driving system, technology from Mobileye (which Intel has announced its intent to acquire), and the VxWorks OS from Intel subsidiary Wind River. This disparate collection of technologies works together to provide some form of integrated solution, versus a “components only” offering. At this stage, however, it’s unclear what, if any, integration between the components raises the value of the solution above the sum of its parts.

FPGAs bring critical compute capability to applications like automated driving, where massive amounts of sensor data must be processed in real time with near-zero latency. A difference of a few milliseconds in applying the brakes can be a matter of life or death, and, scaled across millions of vehicles operating on the highways, it is exactly the sort of design criterion that will save or cost lives.
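
To put numbers on that: even at Traffic Jam Pilot’s modest 37 MPH cap, latency translates directly into distance traveled before the brakes engage (simple kinematics; reaction time and brake actuation are ignored):

```python
# Distance covered while the system is still deciding to brake.
MPH_TO_MPS = 0.44704  # miles per hour to meters per second

def distance_traveled_m(speed_mph: float, latency_ms: float) -> float:
    return speed_mph * MPH_TO_MPS * (latency_ms / 1000.0)

for latency_ms in (1, 10, 100):
    d = distance_traveled_m(37, latency_ms)
    print(f"{latency_ms:>3} ms of latency at 37 MPH = {d:.2f} m")
# prints ~0.02 m, ~0.17 m, and ~1.65 m respectively
```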

Obviously, Intel intends to bundle technologies from its portfolio to address the automotive industry’s demands – particularly in areas like ADAS and autonomous driving. Competitors (such as FPGA rival Xilinx) are targeting the same markets and applications, leveraging partnerships to provide more complete solutions. For Intel’s strategy to pay off, the company will need to engineer in differentiation that makes an all-Intel solution more attractive. Otherwise, the industry will pick and choose components and roll its own.
