feature article

The Day The Machines Took Over (The Wheel)

Intel Powers Audi A8 to Level 3 Automation

We’ve all been waiting for the day when the machines take over the world. In fact, if you read this publication, you’re almost certainly spending most of your working day (even the part suffering through meetings) enabling the machines to take over the world. After all, that’s the whole purpose of technology – to build machines that make our lives easier by automating tasks for humans. Of course, sometimes that means we end up handing over control of jobs that we enjoy, or that we perform to earn a living, or that have a potentially dramatic impact on our safety. Autonomous driving is all of those things.

Over 75 years ago, in 1942, in his short story “Runaround,” Isaac Asimov famously penned his “Three Laws of Robotics.” The Three Laws enumerated the responsibilities of robots in priority order. One could summarize them as, “Hey robots, please don’t hurt us.” In January 2014, SAE International (formerly the Society of Automotive Engineers) issued standard J3016, which defined six levels (numbered 0-5) of driving automation for on-road vehicles. These too could be summarized as, “Hey robots, please don’t hurt us.”

In their quest to take over the day-to-day driving while mostly adhering to the “Please don’t hurt us” directive, robots (like teenagers) will be given driving responsibilities gradually, phasing in more independence and autonomy as they prove themselves worthy. Unlike most teenagers, however, the first production car the robots will get to drive is an Audi A8 (eat your hearts out, kids). The 2018 A8 will be the first production car to reach J3016 Level 3, the first level that qualifies as an “automated driving system.” Yep, the robots are getting their learners’ permits.

Reviewing (and paraphrasing) the levels quickly:

Level 0 – no automation. You’re on your own, bub.
Level 1 – in specific mode(s) (or situations) the system and the human driver are both responsible for steering and acceleration/deceleration.
Level 2 – in specific mode(s) (or situations), the system is responsible for steering and acceleration/deceleration, with the human driver in charge of monitoring the driving environment.
Level 3 – in specific mode(s) (or situations), the system is responsible for steering, acceleration/deceleration, and monitoring the driving environment. The human driver’s attention is not required, but the human driver must be available to take over when requested by the system.
Level 4 – in specific mode(s) (or situations), the system is responsible for all driving tasks.
Level 5 – in all situations, the system is responsible for all driving tasks.
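The taxonomy above divides cleanly along one axis: who is responsible for monitoring the driving environment. As a minimal sketch (the enum names are paraphrases of J3016's terminology, not official identifiers), the levels and that monitoring handoff might be modeled like this:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Paraphrased J3016 driving-automation levels, as summarized above."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # system and human share steering/speed
    PARTIAL_AUTOMATION = 2      # system steers and accelerates; human monitors
    CONDITIONAL_AUTOMATION = 3  # system monitors too; human is the fallback
    HIGH_AUTOMATION = 4         # system handles all tasks in specific modes
    FULL_AUTOMATION = 5         # system handles all tasks in all situations

def human_must_monitor(level: SAELevel) -> bool:
    """At Levels 0-2 the human driver is charged with monitoring
    the driving environment; from Level 3 up, the machine is."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

The significance of the Level 2/3 boundary falls out directly: `human_must_monitor` flips from `True` to `False` exactly where "driver assistance" becomes an "automated driving system."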

Autonomous driving isn’t just technology for technology’s sake, or an indulgent science fair experiment for ambitious engineers. On US roads alone, roughly 30,000 people die each year, and it is estimated that the technology could reduce those deaths by as much as 90%. And safety is just the tip of the iceberg in terms of economic benefits. But, reducing 30,000 deaths by 90% still leaves 3,000 annual fatalities, and rather than blaming the (often deceased) human driver for those fatalities, the families and friends of the deceased will often be able to find fault with the companies (and the engineers) who designed and built the automated driving systems. Yep, if these predictions are accurate, we engineers will kill an estimated 3,000 people per year as a result of our own human failings – in order to save an estimated 27,000 people (who will ironically never know they’ve been saved) from falling victim to the human failings of others. It’s a hefty responsibility.

With that kind of weight on their shoulders, it makes sense that SAE came out with a standard taxonomy and well-defined levels of automation in order to ease us engineers into that peril-laden future as gently and smoothly as possible. J3016 defines six levels from “no automation” to “full automation” for passing the keys to the car over to robots. Levels 0-2 are “driver assistance” where a human driver is charged with monitoring the driving environment. Upon reaching level 3, “Conditional Automation,” however, the responsibility for monitoring the driving environment moves to the machine, with the human driver acting only as “fallback” for when the automated system finds itself unable to handle the situation.

Audi claims that the 2018 A8 will be the first production car to achieve J3016 Level 3 automation. The “Traffic Jam Pilot” mode will take over driving under a specific set of conditions – namely in heavy highway traffic at speeds of less than 37 mph (60 km/h), when a physical barrier separates the traffic lane from oncoming traffic. In that situation (unlike with Tesla’s Level 2 system), the human driver gets the OK to stop paying attention. Yup. If you can cross that emotional barrier, and if local laws allow, you have permission to mess with your phone, watch the onboard TV, or focus all your emotional energy on suppressing road rage. The system, in fact, will be monitoring YOU in order to be sure you’re available to take over if and when the robot is unable to continue its duties safely.
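What makes this “conditional” automation is that every condition must hold before the system engages. A sketch of that gating logic, using the conditions cited above (the function and parameter names are illustrative assumptions, not Audi’s actual software interface):

```python
MAX_SPEED_MPH = 37  # roughly 60 km/h, the limit cited for Traffic Jam Pilot

def traffic_jam_pilot_available(speed_mph: float,
                                barrier_separated_highway: bool,
                                heavy_traffic: bool,
                                locally_legal: bool) -> bool:
    """Hypothetical engagement gate: Level 3 operation is offered only
    when ALL of the article's stated conditions hold simultaneously."""
    return (speed_mph < MAX_SPEED_MPH
            and barrier_separated_highway
            and heavy_traffic
            and locally_legal)
```

Note that the conjunction is the whole point: drop any one condition (say, the speed climbs above the limit) and the system must hand control back to the human fallback.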

The Audi A8 gives some insight into Intel’s strategy with their Programmable Solutions Group (PSG – formerly known as Altera). The A8 will use Intel FPGAs as part of the autonomous driving system, technology from Mobileye (which Intel has announced an intent to acquire), and VxWorks OS from Intel subsidiary Wind River. This disparate collection of technologies works together to provide some form of integrated solution, versus a “components only” offering. However, at this stage, it’s unclear what, if any, level of integration between components brings the value of the solution above the sum of its parts.

FPGAs bring critical compute capability to applications like automated driving, where massive amounts of sensor data must be processed in real time with near-zero latency. The difference in a few milliseconds in applying brakes can be a life-or-death performance issue, and scaled across millions of units operating on the highways, it represents exactly the sort of design criteria that will save or cost lives.
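The “few milliseconds” claim is easy to make concrete with back-of-the-envelope arithmetic (a sketch, not a certified stopping-distance model): the distance a car covers during processing latency, before the brakes even engage, is just speed times delay.

```python
def distance_during_latency_m(speed_mph: float, latency_ms: float) -> float:
    """Distance traveled (in meters) during a given processing latency,
    before braking can begin."""
    speed_m_per_s = speed_mph * 0.44704  # 1 mph = 0.44704 m/s exactly
    return speed_m_per_s * (latency_ms / 1000.0)

# Even at Traffic Jam Pilot's modest 37 mph ceiling, 50 ms of extra
# sensor-to-actuator latency costs roughly 0.8 m of stopping distance.
```

At highway speeds the penalty roughly doubles, which is why pushing sensor fusion into low-latency hardware like FPGAs, rather than batching it through a general-purpose processor, matters.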

Obviously, Intel intends to bundle technologies from its portfolio to address the automotive industry’s demands – particularly in areas like ADAS and autonomous driving. Competitors (such as FPGA rival Xilinx) are targeting the same markets and applications, leveraging partnerships to provide more complete solutions. For Intel’s strategy to pay off, the company will need to engineer in differentiation that makes an all-Intel solution more attractive. Otherwise, the industry will simply pick and choose components and roll its own.
