It has been widely documented that the complexity of system-on-chip (SoC) designs is increasing exponentially, with most SoCs now including multi-threaded processors and many memories with multiple clock domains.
The ITRS report of 2010 shows that the number of processors for portable consumer devices is expected to increase ten-fold between 2009 and 2016, with the performance of each processor leaping 50x over the same time period. ITRS prognosticates that consumer SoC devices could embed 75 processors 10 years from now. Of course, this will be achieved under the constraint of a constant power budget. And, to top it all, design and verification schedules are shrinking. Somehow, it is not surprising that a study conducted by Mentor Graphics reported that more than 70% of designs need at least two respins.
The impact of this torrent of complexity on verification has been the reported demise of simulation for SoC-level verification. Nothing could be further from the truth. Simulation is still the weapon of choice for IP-level verification, where the acceptance of the SystemVerilog language and the Universal Verification Methodology (UVM) is widespread, including randomization as a key part of the verification coverage strategy.
Actually, the verification challenge has moved to the SoC integration stage. In early SoC designs, different IP blocks operated independently and the verification task consisted mostly of checking the interconnect between these blocks and their associated memories. Today, the design of an SoC with multiple embedded processors and multiple shared memories requires thorough verification of a complex interconnect network. Resource sharing under tight performance and power requirements gives rise to problems such as coherency and concurrency that go well beyond the scope of simple interconnect checking.
The preferred verification approach at the system level is to run tests written in the C language because these tests can be run on the register transfer level (RTL) representation of the embedded processors in simulation. However, there has been a key productivity hurdle: C tests had to be developed manually since no automated tool existed to generate the tests and no randomization was available. These directed tests are difficult to develop and debug for complex scenarios, and their maintenance is tedious because they must track the many iterations of the evolving design. The result is that complex SoCs are taped out with limited simulation performed at the system level.
The fallback is to take a quantum leap in verification from simulating a limited number of directed C tests to running production applications on an emulator. This strategy seems to have spurred a healthy increase in the adoption of emulation and prototyping benefiting the three leading emulation vendors –– Cadence, Mentor and Synopsys through its acquisition of EVE –– who are enjoying robust growth. The buzz is that this market dynamic pushed Synopsys’ management team to overcome its reluctance toward hardware and finally acquire emulation vendor EVE. Of course, the renewal of Synopsys’ contract with Intel, rumored to be EVE’s largest customer, did not hurt this union.
This strategy creates a big “verification void” in the SoC flow between a few handcrafted simulated C tests at one end and, at the other end of the spectrum, application software running on the emulator. First of all, verification engineers are forced to give up the flexibility of their simulation environments early in the SoC verification process.
At this stage of the SoC verification process, when RTL bugs are still found at a high frequency, simulation is the preferred debug tool because it is more flexible than the emulator. Having to transition to emulation because few SoC-level tests are available negatively impacts productivity. Also, when operating on the emulator, the application software used for verification is not stressing the design. That’s because this software is typically well behaved, not the kind of code that systematically exercises complex operational corners of the SoC, such as multi-CPU interactions, cache coherency or power-domain turn-on/off. The result is cursory verification at the SoC level, sometimes labeled “stitch and ship.”
What if there were a way to automatically create a set of SoC-level tests that runs on the embedded processors and exercises all the complex operating modes of the chip, yet is sufficiently targeted to run efficiently using simulation? That is the promise of a new set of functional verification tools, including Breker’s TrekSoC product, that automatically generate self-verifying C test cases for the thorough and quick verification of SoCs. This new class of tools, called “SoC Verification,” enables verification engineers to work longer in the preferred simulation environment and delay the transition to emulation until the RTL design is stable enough to run application software. With it comes the new SoC Verification revolution motto: Simulation isn’t dead, long live simulation!
About Michel Courtoy
Michel Courtoy began his career at Intel in design engineering and software engineering. He managed product marketing for layout verification software at Cadence Design Systems. As vice president of marketing for Silicon Perspective, Courtoy created the market for silicon virtual prototyping and was a key player in its acquisition by Cadence in 2001. He served as a vice president at Cadence before becoming the CEO at Certess, leading Certess through sales growth to a successful exit by acquisition. Courtoy holds a Bachelor of Science degree in electrical engineering from the Université Catholique de Louvain, Belgium; a Master of Science degree in electrical engineering from the University of California, San Diego; and an MBA from Santa Clara University in Santa Clara, Calif.