
Other People’s Code

You down with OPC?

Ownership is a big thing for an engineer. When you’re done with a project, you can stand back and state proudly, “I made that!”

Well, it used to be that simple. Perhaps now it’s more like, “See your TV? Well, it needs backlights to work and those backlights are divided into zones and something has to decide how to light the zones to keep power usage down and there’s a big chip that controls a lot of this stuff and a portion of the chip handles the zones and that portion has to talk to the rest of the chip over a complex interface, and that interface is really really important for the picture to look good. So, that interface? I made that! OK, I didn’t actually make it, but I designed it!”

Ownership is good. If you’re going to put your name on it, you want to make sure it’s going to be good. Or, as the old saying goes, if you want it done right, do it yourself.

Of course, ownership has a darker side, typically characterized by the misleading phrase “NIH” – “Not Invented Here.” Which is misleading because, at best, it typically means, “Not Re-invented Here.” Or, being completely truthful, “Not Done By Me.” Such thinking can go beyond pride of ownership to preservation of employment.

But NDBM is an absurd luxury these days; not even Not Done By Us is possible. Every SoC design will have code from outside the team. We typically lump this into the topic of IP, which we’ve covered a few times in this space. But it’s more complicated than that.

There are actually three different categories of Other People’s Code: IP (which I’ll divide into three categories in a minute), legacy code, and open-source code. And, theoretically, these can apply to any code, whether it expresses something that will end up in hardware or software.

IP is typically divided into two camps, design IP and verification IP. But it can actually be divided three ways: design, verification, and modeling. Modeling IP is also for verification, but it’s for verifying a higher-level architecture. The “verification” IP in this ontology refers to IP that’s used to verify an implementation of the same function. For example, if you buy some IP from one vendor and want to do your own quality-control checking on it, you would obtain independent verification IP to do the testing. On the other hand, if you want a more abstract model of the IP you’ve purchased for architectural work, then you can generate modeling IP from the design IP itself.

Such architectural models are often available from IP vendors themselves, but, assuming they’re derived from the implementation, they can’t really be used to test the implementation. Any bugs in the implementation will end up in the model.

Carbon Design Systems announced the Carbon IP Exchange a couple months ago, which, at first glance, sounded like another attempt at an IP marketplace or clearinghouse. But, in fact, it’s a place to go get modeling IP. In some cases, they’re just redistributing someone else’s models; in other cases, they’re generating C-level models from the RTL design IP (in the case of ARM, they do both of these); and in some cases, they wrap highly configurable IP in a GUI that allows selection of parameters.

That this isn’t verification IP for the purpose of checking out implementations is clear for two reasons: some of the models derive from the implementations, and, in the cases where the model is being configured, there is no link to connect the model configuration to the actual implementation configuration (because, according to Bill Neifert, their CTO, customers haven’t requested that). These both point to the models being used architecturally, not as implementation verification IP.

Legacy code is often simply considered to be internally-generated IP, and companies with well-structured IP acquisition and integration mechanisms allow groups from one division to “publish” their IP for consumption elsewhere in the company. But there’s a critical difference to legacy code, especially when not well managed: while commercial IP has an owner (it’s just not you that owns it), legacy code usually has none. It typically wasn’t designed as an independent product, in which case it won’t have undergone the required quality-control mechanisms that formal IP would (or should) undergo.

So, while commercial IP is hopefully designed to be a drop-in, with no knowledge of the internal workings required (or even allowed), legacy code typically has to be made your own. Someone else may have written it, but buck-stoppage transfers to you when you use the code. So you end up having to study it to make sure that, in the design review (or, heaven forefend, failure analysis meeting), you can stand up for the code. It’s like taking responsibility for the guy you brought to the party – you want to know that he’s not some out-of-control meth-head that’s going to make you look bad in the end.

Of course, you don’t want to look bad with a poor commercial IP choice either, but at least there are due-diligence measures you can use there, and, most importantly, there’s someone still around to blame.

Legacy code would almost seem the most problematic source of code. It’s “ours” (the next best thing to “mine”), so it automatically gains tribal acceptance. It’s also free. And, importantly, it carries forward decisions and conventions previously agreed to, increasing the likelihood that new equipment will work in a manner consistent with old equipment. So it will get used; restarting every design from scratch is not an option. Which means that, absent good internal-IP quality controls, designers will have to be more careful with legacy code than with commercial IP.

Finally, we have open-source code. And here we need to make a more careful distinction between hardware and software. The concept of open-source hardware is something of a non-starter for most designers. IPextreme’s Warren Savage says simply that, for hardware, there is “… zero chance of open source.” The reason is that open-source IP has none of the benefits of legacy IP and all of the downsides. While you can patch software bugs, you’re not going to risk a mask change simply for the sake of saving a few bucks on design. So open-source hardware code is pretty much dead at present.

Open-source software, on the other hand, is alive and well. It’s really astonishing the number of algorithms and protocols for which open-source implementations exist. Here again, quality should be a concern, but since you’re getting the source code, you can take ownership much the way you would with legacy code.

The one gotcha that remains with open-source software is the licensing. With IP, licensing is explicitly negotiated with payment terms, and companies are used to implementing whatever tracking mechanisms are required to ensure that royalty obligations, or whatever other terms might exist, are met. There is no such negotiation or contract (other than the click-through, whose signing can usually be summarized as, “Yeah, yeah, whatever… <click>”) when open-source code is downloaded.

And a variety of license styles may apply to any given piece of code. Some, like BSD licenses, are considered more commercially friendly; others, like the GPL, are considered less so because they may require that improvements you make to the code be made available to others under the same terms. Many a manager worries about complex code being “contaminated” by code whose license is inconsistent with the planned deployment.

This is an area that Protecode is trying to address. They have a database of code and licensing that they can use to scan a codebase to see if it contains any licensing issues. They also provide tools and methodologies for managing the licensing obligations across large, complex projects. So, while quality and ownership issues remain, there are attempts to manage the potential legal surprises that open-source code can yield.
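The basic mechanics of such a scan are easy to picture, even if commercial tools like Protecode’s go far beyond it (fingerprinting code snippets against a database, not just reading declared headers). As a rough illustration only — not Protecode’s actual method — here’s a minimal sketch that walks a source tree and reports each file’s declared SPDX license identifier; the `LICENSE_NOTES` classification is a hypothetical shorthand, and any real audit needs legal review:

```python
import re
from pathlib import Path

# Hypothetical, illustrative classification of a few SPDX identifiers.
# A real licensing audit needs far more nuance (and a lawyer).
LICENSE_NOTES = {
    "MIT": "permissive",
    "BSD-3-Clause": "permissive",
    "Apache-2.0": "permissive",
    "GPL-2.0-only": "copyleft",
    "GPL-3.0-only": "copyleft",
}

# Matches the standard "SPDX-License-Identifier: <id>" tag in a file header.
SPDX_RE = re.compile(r"SPDX-License-Identifier:\s*([\w.\-+]+)")

def scan_tree(root):
    """Walk a source tree; map each file path to (license id, rough note)."""
    findings = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file: skip rather than fail the scan
        m = SPDX_RE.search(text)
        if m:
            lic = m.group(1)
            findings[str(path)] = (lic, LICENSE_NOTES.get(lic, "unknown"))
    return findings
```

A scan like this only sees what files *declare* about themselves; the hard part — and the reason tools in this space maintain large databases — is catching code that was copied in without its license header.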

So, from an ownership standpoint, you really have two kinds of code: your own code and OPC. IP remains fully OPC since an owner remains for that code: the provider. It’s a bit more tenuous if you’re doing your own implementation and just purchasing verification IP to make sure it’s right – since the actual code to be shipped is yours. But, regardless, someone’s an owner.

Legacy and open-source code, on the other hand, are really orphans. While they had parents, the parents have died or run away or are in rehab. So if you’re going to adopt them, you have to make them feel like part of the family.

Either someone else needs to own it or you need to make it your own. That’s the only way to be down with OPC.

 

More info:

Protecode

Carbon IP Exchange

IPextreme

 
