
Expanding the Scope of Verification

Transcending Functional Correctness

Looking at the agenda for the 2017 edition of the annual DVCon – arguably the industry’s premier verification conference – one sees precisely what one would expect: tutorials, keynotes, and technical sessions focused on the latest trends and techniques in the ever-sobering challenge of functional verification in the face of the relentless advance of Moore’s Law.

For five decades now, our designs have approximately doubled in complexity every two years. Our brains, however, have not. Our human engineering noggins can still process just about the same amount of stuff that we could back when we left college, assuming we haven’t let ourselves get too stale. That means that the gap between what we as engineers can understand and what we can design has been growing at an exponential rate for over fifty years. This gap has always presented the primary challenge for verification engineers and verification technology. Thirty years ago, we needed to verify that a few thousand transistors were toggling the right ways at the right times. Today, that number is in the billions. In order to accomplish that and span the complexity gap, we need significant leverage.

The fundamentals of verification have persisted. Logic simulation has always been a mainstay, processing vectors of stimuli and expected results as fast and accurately as possible – showing us where our logic or timing has gone awry. Along the way, we started to pick up formal methods – giving us a way to “prove” that our functionality is correct, rather than simulating only the important or likely scenarios. Parallel to those two avenues of advancement, we have been constantly struggling to optimize and accelerate the verification process. We’ve proceduralized verification through standards-based approaches like UVM, and we’ve worked to accelerate the execution of our verification processes through technologies such as FPGA-based prototyping and emulation.
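
To make the simulation-versus-formal contrast concrete, here is a toy sketch in Python – not UVM, and not how a production simulator or formal engine actually works; the spec, impl, and adder width below are invented purely for illustration. Simulation drives sampled stimulus vectors and can easily miss a single bad corner case, while an exhaustive check – the brute-force analogue of what formal tools establish symbolically – covers the entire input space.

```python
import random

WIDTH = 8                      # a toy 8-bit adder; real designs carry billions of state bits
MASK = (1 << WIDTH) - 1

def spec(a, b):
    """Golden reference model: what the adder is supposed to compute."""
    return (a + b) & MASK

def impl(a, b):
    """Device under test: correct everywhere except one corner case."""
    if a == MASK and b == MASK:          # the kind of corner a vector-based test can easily miss
        return 0
    return (a + b) & MASK

def simulate(n_vectors=1000):
    """Simulation-style check: drive sampled stimulus vectors and compare against the spec."""
    for _ in range(n_vectors):
        a, b = random.randint(0, MASK), random.randint(0, MASK)
        if impl(a, b) != spec(a, b):
            return f"simulation found a mismatch at a={a}, b={b}"
    return "simulation saw no mismatch (which is not a proof of correctness)"

def exhaustive():
    """Formal-style check, done here by brute force: every input combination is covered."""
    for a in range(MASK + 1):
        for b in range(MASK + 1):
            if impl(a, b) != spec(a, b):
                return f"exhaustive check found a counterexample at a={a}, b={b}"
    return "exhaustive check passed: correct over the full input space"

print(simulate())
print(exhaustive())
```

Real formal tools don’t enumerate, of course – they reason symbolically, which is what lets them scale beyond toy input spaces – but the distinction between sampling behavior and covering it is the same.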

Taking advantage of Moore’s Law performance gains in order to accelerate the verification of our designs as they grow in complexity according to Moore’s Law is, as today’s kids would probably say, “kinda meta.” But Moore’s Law alone is not enough to keep up with Moore’s Law. It’s the classic perpetual-motion conundrum. There are losses in the system that prevent the process from being perfectly self-sustaining. Each technology generation doubles the complexity of our designs, but it does not double the verification computation we can bring to bear. We gradually accrue a deficit.
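
A back-of-the-envelope model makes the deficit visible. The growth rates and the 1.5 exponent below are illustrative assumptions, not measured data – they simply encode the idea that verification workload grows faster than the raw compute we can throw at it.

```python
# A toy model of the verification deficit. All numbers are assumptions for illustration:
# design complexity and available compute both double every two-year generation, while
# verification workload is assumed to grow super-linearly with complexity (complexity ** 1.5).
COMPLEXITY_GROWTH = 2.0      # per generation (Moore's Law doubling)
COMPUTE_GROWTH = 2.0         # per generation (the same doubling, applied to our tools)
WORKLOAD_EXPONENT = 1.5      # assumed super-linear scaling of verification effort

complexity, compute = 1.0, 1.0
for generation in range(6):
    workload = complexity ** WORKLOAD_EXPONENT
    relative_runtime = workload / compute    # verification time, normalized to generation 0
    print(f"gen {generation}: complexity x{complexity:5.1f}, verification runtime x{relative_runtime:5.2f}")
    complexity *= COMPLEXITY_GROWTH
    compute *= COMPUTE_GROWTH
```

Under these assumptions, verification runtime per design grows by roughly 40 percent every generation even though compute keeps doubling – a deficit that compounds exactly the way the paragraph above describes.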

And the task of verification is constantly expanding in other dimensions as well. At first, it was enough to simply verify that our logic was correct – that the 1s, 0s, and Xs at the inputs would all propagate down to the correct results at the outputs. On top of that, we had to worry about timing and temporal effects on our logic. As time passed, it became important to verify that embedded software would function correctly on our new hardware, and that opened up an entire new world of verification complexity. Then, people got cranky about manufacturing variation and how that would impact our verification results. And we started to get curious about how things like temperature, radiation, and other environmental effects would call our verification results into question. 

Today, our IoT applications span vast interconnected systems from edge devices with sensors and local compute resources through complex communication networks to cloud-based computing and storage centers and back again. We need to verify not just the function of individual components in that chain, but that of the application as a whole. We need to confirm not simply that the application will function as intended – from both a hardware and software perspective – but that it is secure, robust, fault-tolerant, and stable. We need to ensure that performance – throughput and latency – is within acceptable limits, and that power consumption is minimized. This problem far exceeds the scope of the current notion of “verification” in our industry.

Our definition of “correct” behavior is growing increasingly fuzzy over time as well. For example, determining whether a processed video stream “looks good” is almost impossible from a programmatic perspective. The only reliable metric we have is human eyes subjectively staring at a screen. Many other metrics for system success suffer from similar subjectivity. As our digital applications interact more and more directly and intimately with our human, emotional, analog world, our ability to boil verification down to a known set of zeros and ones slips ever farther from our grasp.

The increasing dominance of big data and AI-based algorithms further complicates the real-world verification picture. When the behavior of both hardware and software is too complex to model, it is far too complex to completely verify. Until some radical breakthrough occurs in the science of verification itself, we will have to be content to verify components and subsystems along fairly narrow axes and hope that confirming the quality of the flour, sugar, eggs, butter, and baking soda somehow verifies the deliciousness of the cookie.

There is no question that Moore’s Law is slowly grinding to a halt. And, while that halt may give us a chance to catch our breath on the Moore’s Law verification treadmill, it will by no means bring an end to our verification challenges. The fact is, even if Moore’s Law ended today, we can already build systems far too complex to verify. If your career is in verification, and you are competent, your job security looks pretty rosy.

But this may highlight a fundamental issue with our whole notion of verification. Verification somewhat tacitly assumes a waterfall development model. It presupposes that we design a new thing, then we verify our design, then we make and deploy the thing that we developed and verified. However, software development (and, I’d argue, the development of all complex hardware/software applications such as those currently being created for IoT) follows something much more akin to an agile model – where verification is a continual, ongoing process as applications and systems evolve after their initial deployment.

So, let’s challenge our notion of the scope and purpose of verification. Let’s think about how verification serves our customers and our business interests. Let’s re-evaluate our metrics for success. Let’s consider how the development and deployment of products and services have changed the role of verification. Let’s think about how our technological systems have begun to invert – where applications now span large numbers of diverse systems, rather than being contained within one. Moore’s Law may end, but our real work in verification has just begun.
