Automotive Processors Enable Intelligent Functional Safety 

Gordon Cooper

May 17, 2022 / 5 min read

From safety features like collision avoidance to the self-driving cars that are being tested on highways and city streets, artificial intelligence (AI) technologies play an integral role in modern vehicles. Sophisticated sensors and deep-learning algorithms like neural network models are taking on the role of human drivers, often making faster decisions and contributing to safer roadway experiences.

The real-time, data-intensive computing that goes on under the hood to make these capabilities possible requires high-performance, low-power embedded vision processing solutions. In this blog post, I’ll discuss the key requirements of processors in automotive AI applications and how the processor in your SoC can make all the difference in a car’s ability to brake when it detects a jaywalking pedestrian or swerve as another vehicle veers too close.


Virtual Eyes on the Road

Dozens of sensors placed throughout today’s cars collect a trove of data about everything from acceleration to engine control, pressure, temperature, and velocity. To react instantaneously and appropriately, the data must be processed swiftly, with intelligence guiding the right actions. For example, consider an application like an advanced driver assistance system (ADAS). The combination of radar, cameras, and LiDAR serves as the eyes on the road for ADAS. The higher the degree of autonomy in a vehicle (Level 2 and above), the more critical it becomes for radar sensors to collect a wider range of data and provide a full surround view of the car. AI algorithms applied to the data collected by these sensors derive insights that direct the vehicle’s response, such as distinguishing between a person and an empty box in the middle of the street and braking or swerving accordingly. Another important consideration with radar is interference: as radar systems become more prevalent, the likelihood of mutual interference grows, and it can be mitigated by a processor able to detect and correct the corrupted returns.
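
As a rough illustration of that detect-and-correct idea, the sketch below is a hypothetical Python example (not Synopsys processor code, and far simpler than a production radar pipeline): it flags samples whose amplitude spikes well above a robust noise estimate and repairs them by interpolation.

```python
import numpy as np

def mitigate_radar_interference(samples: np.ndarray, threshold_sigma: float = 4.0) -> np.ndarray:
    """Detect and repair interference spikes in a 1-D radar sample stream.

    Interference from other radars typically appears as short, high-amplitude
    bursts. This toy approach flags samples whose magnitude exceeds a robust
    estimate of the noise level and replaces them by linear interpolation.
    """
    mag = np.abs(samples)
    # Robust noise estimate: median absolute deviation scaled to ~1 sigma.
    noise = 1.4826 * np.median(np.abs(mag - np.median(mag)))
    corrupted = mag > np.median(mag) + threshold_sigma * noise

    cleaned = samples.astype(complex).copy()
    good = np.flatnonzero(~corrupted)
    bad = np.flatnonzero(corrupted)
    if bad.size and good.size:
        # Interpolate real and imaginary parts separately across the gaps.
        cleaned[bad] = (np.interp(bad, good, samples.real[good])
                        + 1j * np.interp(bad, good, samples.imag[good]))
    return cleaned

# Example: a chirp return with a simulated interference burst.
t = np.arange(1024)
signal = np.exp(1j * 0.01 * t**2) + 0.05 * (np.random.randn(1024) + 1j * np.random.randn(1024))
signal[400:420] += 10.0  # simulated burst from an interfering radar
restored = mitigate_radar_interference(signal)
```

Real radar processors apply much more sophisticated detection and reconstruction schemes, but the basic flow of spotting corrupted samples and repairing them before downstream AI processing is similar.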

Another example where AI plays a useful role is in optimizing battery performance in electric vehicles. AI can predict the condition of a subset of cells and, through neural network implementations, reduce the number of current sensors needed in the on-board charging unit. Similarly, AI can be used to predict the aging behavior of lithium-ion batteries and thus help optimize their performance. Porsche Engineering, for instance, has done just that, tapping into data points including temperature, the battery’s state of charge, and its internal resistance. Its AI models have been trained to adapt to driver profiles, making the predictions more precise over time.
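
To make the idea concrete, here is a minimal, hypothetical sketch of such a predictor. The feature set (temperature, state of charge, internal resistance) follows the data points mentioned above, but the model architecture, synthetic data, and code are illustrative assumptions, not Porsche Engineering’s actual system.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: [temperature (degC), state of charge (%), internal resistance (mOhm)]
# paired with a measured remaining-capacity fraction for each cell.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(-10, 45, 500),   # temperature
    rng.uniform(10, 100, 500),   # state of charge
    rng.uniform(20, 80, 500),    # internal resistance
])
# Synthetic label: capacity degrades with higher resistance and temperature extremes.
y = 1.0 - 0.004 * (X[:, 2] - 20) - 0.002 * np.abs(X[:, 0] - 20) + rng.normal(0, 0.01, 500)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X, y)

# Predict remaining capacity for a warm cell with elevated internal resistance.
print(model.predict([[35.0, 80.0, 60.0]]))
```

A deployed system would train on fleet telemetry rather than synthetic data and would run the trained network on the vehicle’s embedded AI accelerator, but the input-to-prediction structure is the same.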

Even though they are small, physical sensors still take up valuable real estate in a vehicle. And with more sensors needed to support increasingly autonomous functions, the ability to replace some of these physical devices with AI-enhanced virtual ones is a welcome opportunity. By embedding artificial neural networks (ANNs) and a Kalman filter (an estimation algorithm) in system controllers, virtual sensors can take the place of their physical counterparts. A virtual sensor can be driven by a predictive model of an actuator or by ANNs combined with state-space observers, maintaining the required safety levels while lowering cost.
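
The sketch below shows the virtual-sensor idea with a generic one-dimensional Kalman filter in Python. It is not tied to any particular automotive controller or Synopsys API; the scenario (estimating vehicle speed from an IMU acceleration input and a noisy distance measurement, with no dedicated speed sensor) is an illustrative assumption.

```python
import numpy as np

class VirtualSpeedSensor:
    """Toy Kalman filter acting as a virtual speed sensor: it fuses a noisy
    distance measurement with an acceleration input to estimate speed."""

    def __init__(self, dt: float, accel_noise: float, meas_noise: float):
        self.dt = dt
        self.x = np.zeros(2)                              # state: [distance, speed]
        self.P = np.eye(2)                                # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity transition
        self.B = np.array([0.5 * dt**2, dt])              # acceleration input model
        self.H = np.array([[1.0, 0.0]])                   # only distance is measured
        self.Q = accel_noise**2 * np.outer(self.B, self.B)  # process noise
        self.R = np.array([[meas_noise**2]])                # measurement noise

    def step(self, accel: float, distance_meas: float) -> float:
        # Predict using the acceleration input.
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the distance measurement.
        y = distance_meas - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[1]  # estimated speed

# Example: constant 2 m/s^2 acceleration, noisy distance readings every 10 ms.
sensor = VirtualSpeedSensor(dt=0.01, accel_noise=0.2, meas_noise=0.5)
true_speed, true_dist = 0.0, 0.0
for _ in range(500):
    true_speed += 2.0 * 0.01
    true_dist += true_speed * 0.01
    est = sensor.step(2.0, true_dist + np.random.randn() * 0.5)
print(f"estimated speed ~ {est:.2f} m/s, true speed = {true_speed:.2f} m/s")
```

In practice, the filter or ANN would run inside a safety-qualified controller and be validated against the physical sensor it replaces, but the estimate-from-related-signals pattern is what makes a virtual sensor possible.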

Functional Safety Requirements for Embedded Vision

For many automotive systems, embedded vision processors are integral components. Embedded vision refers to the combination of image capture and vision processing built directly into a device, enabling it to perceive and respond to its surroundings. While autonomous vehicles are a common application area, embedded vision is also used in industrial environments, drones, medical devices, robotics, and security systems. Embedded vision processors are expected to deliver high-performance processing while offloading the host processor on an automotive SoC. At the same time, they’re also expected to accommodate the stringent power and pricing budgets of embedded applications.

The increasing number of sensors in vehicles, along with complex AI algorithms, demands even greater computational performance from embedded vision processors. Beyond the ability to quickly perform complex mathematical calculations, functional safety is another important criterion, given the safety-critical nature of many automotive systems. ISO 26262, which defines the Automotive Safety Integrity Levels (ASILs), is the industry standard for functional safety. The idea is to design electrical/electronic (E/E) systems in such a way that the risk of failure from systematic faults and random hardware faults is minimized. ASIL D applies to the most hazardous failures and therefore requires the highest degree of rigor in safety assurance.

ARC Processor IP Recognized for Automotive AI Innovation

With our deep experience in the automotive space, Synopsys offers a portfolio of embedded vision processors that meet the performance and safety demands of automotive systems. Our ARC® EV7x Embedded Vision Processors are fully programmable and configurable IP cores that support the low-cost, low-power requirements of embedded applications. An optional high-performance deep neural network (DNN) accelerator provides fast, accurate execution of convolutional neural networks (CNNs) and recurrent neural networks (RNNs). The ARC EV7xFS version supports functional safety compliance up to ASIL D, and the Synopsys ARC MetaWare EV Development Toolkit accelerates application software development cycles.

At the Embedded Vision Summit last night, the ARC EV7xFS Processor IP was awarded “Best Edge Automotive Solution” from the 2022 Edge AI and Vision Product of the Year Awards presented by the Edge AI and Vision Alliance. “Synopsys has been a consistent innovator in embedded vision and AI processors. I congratulate the Synopsys team on earning this distinction for the ARC EV7xFS processor IP, which merges high processing performance for complex neural networks and functional safety in a low-power, configurable core,” said Jeff Bier, founder of the Edge AI and Vision Alliance.


Synopsys ARC EV7xFS Functional Safety Processor IP was named Best Edge Automotive Solution

Customers have taken advantage of the ARC EV7x family for AI-fueled automotive applications. For example, Synopsys collaborates closely with Infineon, the industry’s leading automotive semiconductor supplier, to facilitate development of automotive AI and vehicle virtualization applications. Infineon’s AURIX™ TC4x microcontrollers integrate a high-performance AI accelerator, called a parallel processing unit (PPU), powered by the ARC EV Processor IP. “With the AURIX TC4x PPU based on Synopsys ARC EV processors, Infineon and Synopsys are providing affordable AI to support an array of electric vehicle use cases,” said Joerg Schepers, vice president at Infineon. “The proven functional safety option in the ARC EV processor family helps automotive designers more quickly achieve the required certifications for safe, smart vehicles.”

ARC EV7x Processor IP provides a foundation for the next generation of AI engines in the portfolio, namely our newly launched ARC NPX family of neural processing unit (NPU) IP. ARC NPX NPU IP offers the industry’s highest performance as measured in tera operations per second (up to 3,500 TOPS) and support for the latest and most complex neural networks.


Jeff Bier, founder of the Edge AI and Vision Alliance (left), presents Yankin Tanurhan, sr. vice president of Engineering in the Synopsys Solutions Group, with the Best Edge Automotive Solution award for the ARC EV7xFS Processor IP.

Summary

As the automotive industry pushes toward bringing Level 5 fully autonomous cars to our roadways, sensors like radar and cameras, along with AI algorithms, promise to continue playing significant roles. To derive meaningful insights from the vast amount of data collected in real time, embedded vision processors will be expected to ramp up their processing prowess to quickly crunch through complex mathematical calculations. It’s a tall order, but given the innovations already made, experienced automotive IP developers are well positioned to keep pace with progress in the AI world.
