
A Clean-By-Construction Deck

Sage Introduces a DRC Compiler

So you’ve spent many months on a chip design project that’s winding down. Layout is done, or at least you hope it’s done, and it’s time to make sure that you did it right. (Yes, I know… most designs today will involve many people doing different things, so the “you” here is intended to refer collectively to all of you.)

That means it’s time to run your design rule checks (DRCs). And as the computer hums away on your DRC deck (presumably so-called because at one time it was a deck of Hollerith cards), inspecting every nook and cranny (you hope) for violations, did you ever wonder where that deck came from?

It’s easy to assume that there is this methodical process by which some committee of technology dudes (“dude” in the gender-neutral sense) sat around planning a technology and a set of rules to live by, which they then codified into a DRC deck and pronounced it good.

Well, to hear a new startup, Sage, describe it, seeing where your DRC deck comes from is probably more like watching sausage get made. It may not be something you want to witness. And, in fact, with the size of the decks these days, there’s got to be a veritable sausage-fest going on in the technology houses.

It’s not a simple matter of codifying knowledge into specific, executable DRC rules. And here’s where we have to define some nuances more closely. I’ve always thought of DRC decks as consisting primarily of rules that can be tested as pass or fail. But it’s not quite that simple. Yes, there is a description of the rules, but then there’s a lot of specific code that goes into how you identify violations on-screen.

You might think that a layout program would have a layer for the layout graphics and then a separate “signaling” layer for issues, but apparently not. The DRC rule code literally instantiates polygons that point out to you the location of a violation. And it’s up to the coder to decide how best to represent that violation.
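To make that concrete, here’s a toy sketch in Python of what one such check conceptually does (real decks are written in vendor-specific rule languages, and this resembles none of them): find shapes that sit too close together and instantiate marker polygons spanning each offending gap.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x1: float
    y1: float
    x2: float
    y2: float  # lower-left (x1, y1) and upper-right (x2, y2) corners

def min_spacing_markers(shapes, min_space):
    """Flag every pair of rectangles closer together than min_space.

    Returns marker rectangles spanning each offending gap, so that a
    layout viewer could draw the violation right where it occurs.
    """
    markers = []
    for i, a in enumerate(shapes):
        for b in shapes[i + 1:]:
            dx = max(a.x1 - b.x2, b.x1 - a.x2, 0.0)  # horizontal gap
            dy = max(a.y1 - b.y2, b.y1 - a.y2, 0.0)  # vertical gap
            gap = (dx * dx + dy * dy) ** 0.5
            if 0.0 < gap < min_space:  # gap of 0 means touching/merged
                markers.append(Rect(min(a.x1, b.x1), min(a.y1, b.y1),
                                    max(a.x2, b.x2), max(a.y2, b.y2)))
    return markers

# Two metal1 shapes 0.03 um apart, checked against a 0.05 um rule:
metal1 = [Rect(0.0, 0.0, 1.0, 0.2), Rect(0.0, 0.23, 1.0, 0.4)]
print(min_spacing_markers(metal1, min_space=0.05))  # one marker polygon
```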

Such a rule has to be manually written and coded. Each one. One after another.

And where do the guys writing these rules get their input? From, say, a spreadsheet that carefully describes, in mathematical terms, the rule and any parameters or conditions and specifically how it should be represented on screen? Nope. They get it from a document. A written document called the Design Rule Manual (DRM… no, not that DRM).

The DRM has text descriptions of the rules. The DRC coder reads those descriptions and then decides how to devise a test for them. That means interpreting the intent of the rule correctly (there may be ambiguities). The figuring-out-how-to-test-it part may also not be trivial, and there are cases where the exact rule is too hard to codify and an approximation must do.

So here we have two levels of interpretation. The first is when the DRM author interprets a rule and casts that interpretation into prose, and the second is when the poor guy writing the code reads that prose and decides how to turn that into actual DRC deck code.
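Here’s a contrived example of how that can go wrong. Suppose the DRM says: “Minimum Metal1 spacing: 0.05 um.” Does that apply to two shapes on the same net? The prose doesn’t say, so two coders can honestly read it two different ways and ship two different checks (everything below is invented for illustration):

```python
from itertools import combinations

def h_gap(a, b):
    # Horizontal gap between two (x_start, x_end, net) wire stubs.
    return max(a[0] - b[1], b[0] - a[1], 0.0)

def spacing_violations(shapes, min_space, exempt_same_net):
    # Interpretation A (exempt_same_net=False): the rule binds every pair.
    # Interpretation B (exempt_same_net=True): same-net shapes are exempt.
    return [(a, b) for a, b in combinations(shapes, 2)
            if not (exempt_same_net and a[2] == b[2])
            and 0.0 < h_gap(a, b) < min_space]

wires = [(0.00, 0.10, "VDD"), (0.13, 0.20, "VDD"), (0.23, 0.30, "OUT")]
print(len(spacing_violations(wires, 0.05, exempt_same_net=False)))  # 2
print(len(spacing_violations(wires, 0.05, exempt_same_net=True)))   # 1
```

Same prose, same layout, different violation counts. Multiply that by 5,000 rules and the scale of the problem becomes clear.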

This might not be so bad if there weren’t so many rules. In fact, we’ve lived with it for a long time, and it seems to have been manageable as long as the number of rules remained in the hundreds. But those days are gone. Here are some numbers Sage puts forth: a modern DRM is something like a 600-page document containing about 5,000 rules. Those rules will turn into about 100,000 lines of DRC code. All manually written. Text and code.

As a result, it can be an incredible chore to validate the DRC code against the DRM to figure out if all of the rules got covered without escape loopholes and without tightening down too far. Which means that, when the decks are first released, they typically have bugs in them, and it can take years before you can really say that they’re clean.

So Sage is proposing a design rule compiler. OK, more than proposing: they’ve built one and recently introduced it. They call it iDRM. The way it works is by allowing graphical entry of a rule, along with conditions and other complicating factors. The tool will then create an executable check.

From that rule, iDRM can then create test layouts that should pass and test layouts that should fail; running the executable check against those structures tests whether the rule was captured correctly. It can also take existing layouts and check them against the rules, listing all violations.
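Sage hasn’t published how that generation works internally, but the idea reduces to something like this sketch: for each rule, build one structure just on the legal side of the limit and one just on the illegal side, then confirm that the executable check accepts the first and flags the second.

```python
def make_spacing_testcases(min_space, eps=0.001):
    """Return (should_pass, should_fail) layouts: pairs of 1-D wires
    placed just outside and just inside the minimum-spacing limit."""
    should_pass = [(0.0, 0.1), (0.1 + min_space + eps, 0.3)]
    should_fail = [(0.0, 0.1), (0.1 + min_space - eps, 0.3)]
    return should_pass, should_fail

def spacing_ok(wires, min_space):
    # The check under test: true if the two wires are legally spaced.
    a, b = sorted(wires)
    return (b[0] - a[1]) >= min_space

good, bad = make_spacing_testcases(min_space=0.05)
assert spacing_ok(good, 0.05)      # the check accepts the legal layout
assert not spacing_ok(bad, 0.05)   # ...and flags the illegal one
print("rule behaves as intended on both test structures")
```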

The idea here is to skip the whole writing-a-War-and-Peace-tome thing and use this tool instead to codify the rules. You then get a one-to-one correspondence between each rule and a correct-by-construction executable version of the rule.

The guy using this can take advantage of the test structures and layout scans to confirm whether he or she defined the rule properly. If there are too many misses (false negatives), then you can see right then and there that you need to tighten up the rule. If you’re getting too many false positives, then you need to find some way to express the intent of the rule that’s not so restrictive.
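Conceptually, that validation loop boils down to a comparison like the following (the data and structure here are purely illustrative):

```python
def score_rule(flagged, known_bad):
    """Compare what a rule flags against a labeled reference set."""
    false_negatives = known_bad - flagged  # real errors the rule missed
    false_positives = flagged - known_bad  # legal layout wrongly flagged
    return false_negatives, false_positives

flagged   = {"loc_a", "loc_b", "loc_x"}   # what the captured rule found
known_bad = {"loc_a", "loc_b", "loc_c"}   # what it should have found
misses, false_alarms = score_rule(flagged, known_bad)
print(misses)        # {'loc_c'}: the rule is too loose; tighten it up
print(false_alarms)  # {'loc_x'}: the rule is too strict; relax it
```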

The part that’s not done yet is taking those rules and translating them into specific code that can be run by third-party DRC engines like Calibre, IC Validator, or Assura. Sage calls that DRC Synthesis. That’s an obvious next step in providing a complete process-concept-to-runnable-DRC-deck flow. But the focus right now is on getting engineers in various parts of the silicon ecosystem started using iDRM to create the rules. There’s plenty of work to do at that stage. Which presumably gives Sage more breathing space to get the last bit done.
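Since that piece isn’t built yet, any code here is pure speculation on my part; but just to fix ideas, rule-to-deck translation might conceptually look like this (the emitted syntax is made up, not actual Calibre, IC Validator, or Assura code):

```python
# One captured rule as structured data (field names are my invention):
rule = {"name": "M1.S.1", "layer": "metal1",
        "kind": "min_spacing", "value_um": 0.05}

def synthesize(r):
    """Translate a captured rule into deck text. The output syntax is
    illustrative pseudo-deck, not any real vendor's rule language."""
    if r["kind"] == "min_spacing":
        return (f'// {r["name"]}: auto-generated, do not hand-edit\n'
                f'SPACING {r["layer"]} < {r["value_um"]} '
                f'REPORT "{r["name"]}"')
    raise NotImplementedError(r["kind"])

print(synthesize(rule))
```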

In case any of you are thinking that this is a foundry-only tool, that’s not necessarily the case. Yes, the initial process design kit (PDK) will be developed at the foundry, and foundries will probably constitute the bulk of the users. But fabless houses often supplement the basic PDK with rules of their own, so they would also be able to leverage iDRM to do that. Failure-analysis folks may also want to play with it in order to resolve failures or yield issues that may turn out to be related to layout.

As for the rest of you, well, if it all works as promised, you just might sleep a bit better knowing that the deck you’re working with is clean. Or cleaner than usual, anyway. And if you want to go see how that deck got made, well, it might not seem quite as gross anymore.

 

More info:

Sage DA
