
Expanding the Scope of Verification

Transcending Functional Correctness

Looking at the agenda for the 2017 edition of the annual DVCon – arguably the industry’s premier verification conference – one sees precisely what one would expect: tutorials, keynotes, and technical sessions focused on the latest trends and techniques in the ever-sobering challenge of functional verification in the face of the relentless advance of Moore’s Law.

For five decades now, our designs have approximately doubled in complexity every two years. Our brains, however, have not. Our human engineering noggins can still process just about the same amount of stuff that we could back when we left college, assuming we haven’t let ourselves get too stale. That means that the gap between what we as engineers can understand and what we can design has been growing at an exponential rate for over fifty years. This gap has always presented the primary challenge for verification engineers and verification technology. Thirty years ago, we needed to verify that a few thousand transistors were toggling the right ways at the right times. Today, that number is in the billions. In order to accomplish that and span the complexity gap, we need significant leverage.

The fundamentals of verification have persisted. Logic simulation has always been a mainstay, processing vectors of stimuli and expected results as fast and accurately as possible – showing us where our logic or timing has gone awry. Along the way, we started to pick up formal methods – giving us a way to “prove” that our functionality is correct, rather than trying to exhaustively simulate the important or likely scenarios. In parallel with those two avenues of advancement, we have been constantly struggling to optimize and accelerate the verification process. We’ve proceduralized verification through standards-based approaches like UVM, and we’ve worked to accelerate the execution of our verification processes through technologies such as FPGA-based prototyping and emulation.
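
To make the contrast concrete, here is a toy sketch in Python (purely for intuition; real hardware verification lives in SystemVerilog testbenches, UVM, and formal tools). It shows how a handful of simulated stimulus vectors can sail past a lurking bug, while an exhaustive check of the same tiny design, a stand-in for what a formal proof buys us, catches it. The 4-bit adder, its corner-case bug, and the reference model are all hypothetical.

# Toy illustration: directed simulation vs. exhaustive checking of a tiny "design."
# Everything here is hypothetical and exists only to make the simulate-vs.-prove point.

def dut_add4(a: int, b: int) -> int:
    """Hypothetical 4-bit adder 'design under test' with a deliberate corner-case bug."""
    if a == 15 and b == 15:          # the lurking bug: full-scale operands wrap to zero
        return 0
    return (a + b) & 0x1F            # 5-bit result (4-bit sum plus carry-out)

def reference_add4(a: int, b: int) -> int:
    """Golden reference model."""
    return (a + b) & 0x1F

# "Simulation": a few directed, likely-looking stimulus vectors -- the bug escapes.
directed_vectors = [(0, 0), (1, 2), (7, 8), (15, 1), (3, 12)]
for a, b in directed_vectors:
    assert dut_add4(a, b) == reference_add4(a, b), f"mismatch at {a}, {b}"
print("Directed simulation passed -- yet the design is still broken.")

# "Proof" by exhaustion (feasible only because the state space is tiny): every case.
failures = [(a, b) for a in range(16) for b in range(16)
            if dut_add4(a, b) != reference_add4(a, b)]
print(f"Exhaustive check found {len(failures)} failing case(s): {failures}")

The point is not the Python; it is that simulation samples the input space while a proof covers it, and covering that space is exactly what becomes intractable as complexity doubles and doubles again.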

Taking advantage of Moore’s Law performance gains in order to accelerate the verification of our designs as they grow in complexity according to Moore’s Law is, as today’s kids would probably say, “kinda meta.” But Moore’s Law alone is not enough to keep up with Moore’s Law. It’s the classic perpetual-motion conundrum. There are losses in the system that prevent the process from being perfectly self-sustaining. Each technology-driven doubling of the complexity of our designs does not yield a doubling of the computation that can be achieved. We gradually accrue a deficit.

And the task of verification is constantly expanding in other dimensions as well. At first, it was enough to simply verify that our logic was correct – that the 1s, 0s, and Xs at the inputs would all propagate through to the correct results at the outputs. On top of that, we had to worry about timing and temporal effects on our logic. As time passed, it became important to verify that embedded software would function correctly on our new hardware, and that opened up an entire new world of verification complexity. Then, people got cranky about manufacturing variation and how it would impact our verification results. And we started to get curious about how temperature, radiation, and other environmental effects might call those results into question.

Today, our IoT applications span vast interconnected systems – from edge devices with sensors and local compute resources, through complex communication networks, to cloud-based computing and storage centers, and back again. We need to verify not just the function of individual components in that chain, but of the application as a whole. We need to confirm not simply that the application will function as intended – from both a hardware and a software perspective – but that it is secure, robust, fault-tolerant, and stable. We need to ensure that performance – throughput and latency – is within acceptable limits, and that power consumption is minimized. This problem far exceeds the scope of the current notion of “verification” in our industry.

Our definition of “correct” behavior is growing increasingly fuzzy as well. For example, determining whether a processed video stream “looks good” is almost impossible from a programmatic perspective. The only reliable metric we have is human eyes subjectively staring at a screen. Many other measures of system success suffer from similar subjectivity problems. As our digital applications interact more and more directly and intimately with our human, emotional, analog world, our ability to boil verification down to a known set of zeros and ones slips ever farther from our grasp.
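
As a small illustration of just how fuzzy this gets, consider the sketch below (Python, with made-up frame data). A bit-exact comparison rejects any processed frame that differs from the reference at all, while a classical objective proxy such as PSNR produces a score that only loosely tracks what human eyes would call “good.”

# Minimal sketch with hypothetical frame data: bit-exact checking vs. an objective
# quality proxy (PSNR). Neither actually answers "does this video look good?"
import numpy as np

def psnr(reference: np.ndarray, processed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher usually (but not always) means 'looks better'."""
    mse = np.mean((reference.astype(np.float64) - processed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical 8-bit grayscale frame and a mildly noisy "processed" version of it.
rng = np.random.default_rng(0)
reference_frame = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
noise = rng.integers(-3, 4, size=reference_frame.shape)
processed_frame = np.clip(reference_frame.astype(int) + noise, 0, 255).astype(np.uint8)

print("Bit-exact match:", np.array_equal(reference_frame, processed_frame))  # almost certainly False
print(f"PSNR: {psnr(reference_frame, processed_frame):.1f} dB")              # a respectable score, yet no proof it "looks good"

Neither number settles the question we actually care about; that still takes a person staring at a screen.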

The increasing dominance of big data and AI-based algorithms further complicates the real-world verification picture. When the behavior of both hardware and software is too complex to model, it is far too complex to completely verify. Until some radical breakthrough occurs in the science of verification itself, we will have to be content to verify components and subsystems along fairly narrow axes and hope that confirming the quality of the flour, sugar, eggs, butter, and baking soda somehow verifies the deliciousness of the cookie.

There is no question that Moore’s Law is slowly grinding to a halt. And, while that halt may give us a chance to catch our breath on the Moore’s Law verification treadmill, it will by no means bring an end to our verification challenges. The fact is, even if Moore’s Law ended today, we can already build systems far too complex to verify. If your career is in verification, and you are competent, your job security looks pretty rosy.

But this may highlight a fundamental issue with our whole notion of verification. Verification somewhat tacitly assumes a waterfall development model. It presupposes that we design a new thing, then we verify our design, then we make and deploy the thing that we developed and verified. However, software development (and, I’d argue, the development of all complex hardware/software applications such as those currently being created for IoT) follows something much more akin to agile development – where verification is an ongoing process as applications and systems continue to evolve after their initial deployment.

So, let’s challenge our notion of the scope and purpose of verification. Let’s think about how verification serves our customers and our business interests. Let’s re-evaluate our metrics for success. Let’s consider how the development and deployment of products and services have changed the role of verification. Let’s think about how our technological systems have begun to invert – where applications now span large numbers of diverse systems, rather than being contained within one. Moore’s Law may end, but our real work in verification has just begun.
