
The Day The Machines Took Over (The Wheel)

Intel Powers Audi A8 to Level 3 Automation

We’ve all been waiting for the day when the machines take over the world. In fact, if you read this publication, you’re almost certainly spending most of your working day (even the part suffering through meetings) enabling the machines to take over the world. After all, that’s the whole purpose of technology – to build machines that make our lives easier by automating tasks for humans. Of course, sometimes that means we end up handing over control of jobs that we enjoy, or that we perform to earn a living, or that have a potentially dramatic impact on our safety. Autonomous driving is all of those things.

Over 75 years ago, in 1942, in his short story “Runaround,” Isaac Asimov famously penned his “Three Laws of Robotics.” The Three Laws enumerated the responsibilities of robots in priority order. One could summarize them as, “Hey robots, please don’t hurt us.” In January 2014, SAE International (the Society of Automotive Engineers) issued J3016, which defined six levels (numbered 0-5) of driving automation for on-road vehicles. These too could be summarized as, “Hey robots, please don’t hurt us.”

In their quest to take over the day-to-day driving while mostly adhering to the “Please don’t hurt us” directive, robots (like teenagers) will be given driving responsibilities gradually, phasing in more independence and autonomy slowly as they prove themselves worthy. Unlike most teenagers, however, the first production car the robots will get to drive is an Audi A8 (Eat your hearts out, kids). The 2018 A8 will be the first production car to reach J3016 Level 3, which is the first level that qualifies as an “automated driving system.” Yep, the robots are getting their learners’ permits.

Reviewing (and paraphrasing) the levels quickly:

Level 0 – no automation. You’re on your own, bub.
Level 1 – in specific mode(s) (or situations), the system handles either steering or acceleration/deceleration, with the human driver responsible for the rest.
Level 2 – in specific mode(s) (or situations), the system is responsible for steering and acceleration/deceleration, with the human driver in charge of monitoring the driving environment.
Level 3 – in specific mode(s) (or situations), the system is responsible for steering, acceleration/deceleration, and monitoring the driving environment. The human driver’s attention is not required, but the human driver must be available to take over when requested by the system.
Level 4 – in specific mode(s) (or situations), the system is responsible for all driving tasks.
Level 5 – in all situations, the system is responsible for all driving tasks.
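One way to see the structure in the taxonomy above is to note which party holds each of three responsibilities at each level. The sketch below is purely illustrative (the labels and the dictionary layout are this example’s invention, not part of J3016), but it captures the key transition: at Level 3, environment monitoring moves from the human to the machine.

```python
# Illustrative model of the paraphrased J3016 levels above.
# The "steering" key lumps steering with acceleration/deceleration;
# "monitoring" means monitoring the driving environment;
# "fallback" means who must recover when the system can't cope.
SAE_LEVELS = {
    0: {"steering": "driver", "monitoring": "driver", "fallback": "driver"},
    1: {"steering": "shared", "monitoring": "driver", "fallback": "driver"},
    2: {"steering": "system", "monitoring": "driver", "fallback": "driver"},
    3: {"steering": "system", "monitoring": "system", "fallback": "driver"},
    4: {"steering": "system", "monitoring": "system", "fallback": "system"},
    5: {"steering": "system", "monitoring": "system", "fallback": "system"},
}

def is_automated_driving_system(level: int) -> bool:
    """J3016 calls a system 'automated driving' once the machine,
    not the human, monitors the driving environment (Level 3+)."""
    return SAE_LEVELS[level]["monitoring"] == "system"
```

Run through the table and the Audi A8’s claim snaps into focus: Level 3 is the first row where the human driver stops being the monitor, which is exactly the line the 2018 A8 crosses.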

Autonomous driving isn’t just technology for technology’s sake, or an indulgent science fair experiment for ambitious engineers. On US roads alone, 30,000 people die each year, and it is estimated that technology could reduce those deaths by as much as 90%. And safety is just the tip of the iceberg in terms of economic benefits. But, reducing 30,000 deaths by 90% still leaves 3,000 annual fatalities, and rather than blaming the (often deceased) human driver for those fatalities, the families and friends of those deceased will often be able to find fault with the companies (and the engineers) who designed and built automated driving systems. Yep, if these predictions are accurate, we engineers will kill an estimated 3,000 people per year as a result of our own human failings – in order to save an estimated 27,000 people (who will ironically never know they’ve been saved) from falling victim to the human failings of others. It’s a hefty responsibility.
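The back-of-the-envelope math behind those numbers is simple enough to write down. The figures below are the article’s own estimates, not independent data:

```python
# Arithmetic behind the article's estimates (figures from the text).
annual_us_road_deaths = 30_000
estimated_reduction = 0.90  # "as much as 90%"

lives_saved = int(annual_us_road_deaths * estimated_reduction)
remaining_fatalities = annual_us_road_deaths - lives_saved
# 27,000 saved each year; 3,000 fatalities now attributable,
# at least in part, to the engineers who built the systems.
```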

With that kind of weight on their shoulders, it makes sense that SAE came out with a standard taxonomy and well-defined levels of automation in order to ease us engineers into that peril-laden future as gently and smoothly as possible. J3016 defines six levels from “no automation” to “full automation” for passing the keys to the car over to robots. Levels 0-2 are “driver assistance” where a human driver is charged with monitoring the driving environment. Upon reaching level 3, “Conditional Automation,” however, the responsibility for monitoring the driving environment moves to the machine, with the human driver acting only as “fallback” for when the automated system finds itself unable to handle the situation.

Audi claims that the 2018 A8 will be the first production car to achieve J3016 Level 3 automation. The “Traffic Jam Pilot” mode will take over driving under a specific set of conditions – namely in heavy highway traffic at speeds below 37 mph (60 km/h), when a physical barrier separates the traffic lane from oncoming traffic. In that situation (unlike with Tesla’s Level 2 system), the human driver gets the OK to stop paying attention. Yup. If you can cross that emotional barrier, and if local laws allow, you have permission to mess with your phone, watch the onboard TV, or focus all your emotional energy on suppressing road rage. The system, in fact, will be monitoring YOU in order to be sure you’re available to take over if and when the robot is unable to continue its duties safely.
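The narrowness of those engagement conditions is the whole point of “conditional” automation, and it can be sketched as a simple gate. To be clear, this is a hypothetical illustration: the function name, parameters, and structure are invented here, and Audi’s actual (proprietary) logic is vastly more involved, fusing camera, radar, and lidar data before any hand-off.

```python
# Hypothetical sketch of the Level 3 engagement gate described above.
# All names and the 37 mph threshold structure are illustrative only;
# the real Traffic Jam Pilot logic is proprietary and far more complex.
def traffic_jam_pilot_available(speed_mph: float,
                                heavy_traffic: bool,
                                barrier_separated: bool) -> bool:
    """Permit the Level 3 hand-off only inside the narrow operating
    conditions Audi describes: heavy highway traffic, under 37 mph,
    with a physical barrier separating oncoming traffic."""
    return heavy_traffic and barrier_separated and speed_mph < 37.0
```

Fall outside any one of those conditions – traffic clears, speed rises, the barrier ends – and the system must hand control back, which is why it spends its time monitoring the driver’s availability.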

The Audi A8 gives some insight into Intel’s strategy with its Programmable Solutions Group (PSG – formerly known as Altera). The A8 will use Intel FPGAs as part of the autonomous driving system, technology from Mobileye (which Intel has announced an intent to acquire), and the VxWorks OS from Intel subsidiary Wind River. Together, these technologies form something closer to an integrated solution than a “components only” offering. At this stage, however, it’s unclear what, if any, level of integration between the components lifts the value of the solution above the sum of its parts.

FPGAs bring critical compute capability to applications like automated driving, where massive amounts of sensor data must be processed in real time with near-zero latency. A difference of a few milliseconds in applying the brakes can be a matter of life or death, and scaled across millions of vehicles operating on the highways, it represents exactly the sort of design criterion that will save or cost lives.

Obviously, Intel intends to bundle technologies from its portfolio to address the automotive industry’s demands – particularly in areas like ADAS and autonomous driving. Other competitors (such as FPGA rival Xilinx) are targeting the same markets and applications, leveraging partnerships to provide more complete solutions. For Intel’s strategy to pay off, the company will need to engineer in differentiation that makes an all-Intel solution more attractive. Otherwise, the industry will pick and choose components and roll its own.
