
Tackling IC Development Management

Sapient IC Aims to Boost Control and Confidence

Product life-cycle management tools are a double-edged sword. They help solve a serious problem by standardizing how things are done and reducing uncertainty, but they can also end up bogging a company down in process and tool development. The more complex the product whose life is being managed, the truer this is.

The name “life cycle” suggests a womb-to-tomb scope, and it often is exactly that, but the same challenges accrue to the somewhat less comprehensive tools that cover just one major element of the life cycle, whether that be development, production management, or end-of-life management. Even those phases of a product’s life, taken alone, can be inordinately complex.

Exactly how a process like this works at any given company is also a strong function of that company’s culture. If the culture is one of strict adherence to data, logic, and reason, then you won’t stop until you’ve crafted a system that accurately predicts the future five years out (and if it doesn’t, then either your data, logic, or reasoning was wrong, and you need to work on that). If you’re a nimble, risk-taking company, then you’ll have a methodology predicated on the assumption that there’s only so much you can know, you might be wrong, and you should make allowances for multiple outcomes.

In the software development world, this variety plays out between two extremes. At one end, you get the massive methodologies embodied in the Rational Unified Process, where the focus is on an enormous up-front planning effort to make the development phase go smoothly. The other extreme is the, well, “extreme” or “agile” approach: pick short-term goals, get them done, and see what happens; then decide what to do next.

It’s like trying to cross the Rocky Mountains. You can hire a plane with a camera to fly high up and take pictures of the area, then combine that with surveying data on elevations and grades and plot the lowest-effort path from where you are to where you want to go, all before taking the first step. Or you can look at the easiest ridge to get to from where you are, get there, and then see what looks like the next easiest target from that vantage point, repeating until you get across.

There’s no one right answer. A multi-bazillion-dollar mega-monster company with 500 people writing many millions of lines of code will descend into absolute chaos trying to behave in an agile fashion. Likewise, a start-up will never get started if it tries a heavyweight approach – it will use up its funding on planning alone.

There’s always risk when developing a new product, but software, by its very nature, gives you lots of flexibility in terms of the deliverable. Not so with ICs – and even less so with complex SoCs. Whereas a couple million dollars might fund the first product of a software startup, it won’t buy the mask set for a modern IC on a leading-edge process node. The entire EDA industry is predicated on the risk and cost of making a mistake on your masks.

So a purely agile approach to IC design is pretty much a non-starter. At the very least, managers would die of heart attacks before the project ever got done. There are too many surprises; by definition, you wouldn’t know what Step 3 is going to be until Step 2 is complete. And Step 3 might turn out to be, “redo Step 2 a different way because Step 2 didn’t work.”

This means that IC development needs more structure. In general, this structure comes from process design and, like so many life-cycle management systems, will be peculiar to each company. The problem is that these systems are generally ad hoc – it’s very hard to pin them down into a single, well-understood, common process that all IC companies follow.

Yet the fact remains that project managers are going to want to keep tabs on a project and, in particular, to know how changes as the project progresses will affect the resulting business projections. And this is what Sapient is trying to address with their new Sapient IC product.

What they’ve done is try to create easily adaptable models in an overall package that can be tailored to any IC project at any company. The trick is finding that magic level of flexibility where you can make the tool work for your company and your process without needing a huge project just to get things set up.

Founder and CEO Subash Peddu claims that you can get going in a day or two – and most of that work is collecting the data. Sapient can also provide customization, which can generally be done in three to five days. “We don’t use a Siebel approach,” says Peddu, referring to the entire consulting industry that built up around the Siebel CRM tool in the late ’90s to adapt the tool to customers – an industry that crashed and burned in the dot-com bust.

For the most part, the database and models are built in. You can decide how detailed to get with your modeling, so you don’t have to fill in all the data if there are pieces you don’t have. In some cases, this may limit which analyses you can see – sensitivity analysis, for example, isn’t possible if the data to support it isn’t in place (which sounds kind of obvious).
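To make the sensitivity-analysis idea concrete, here’s a minimal sketch of a one-at-a-time sensitivity check of the kind such a tool could run once the underlying numbers exist. The profit model, parameter names, and dollar figures below are all hypothetical; Sapient hasn’t published its model internals.

```python
# Hypothetical one-at-a-time sensitivity check. The profit model and
# every number here are invented for illustration only.

def profit(unit_price, units_sold, nre_cost, unit_cost):
    """Toy IC profit model: revenue minus NRE minus per-unit costs."""
    return unit_price * units_sold - nre_cost - unit_cost * units_sold

base = {"unit_price": 12.0, "units_sold": 2_000_000,
        "nre_cost": 8_000_000, "unit_cost": 5.0}
baseline = profit(**base)

# Bump each input by 10% in isolation and see how far profit moves;
# the inputs with the biggest swings are the ones worth watching.
for param in base:
    bumped = dict(base, **{param: base[param] * 1.10})
    delta = profit(**bumped) - baseline
    print(f"+10% {param:10s} -> profit moves ${delta / 1e6:+.2f}M")
```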

As one example of what can be done, features can be correlated with profit as long as you have numbers for the various feature-bundle scenarios. By following two trajectories – the sales that each feature bundle enables and the cost of implementing it – you can graph the profitability of each bundle. You can also do some risk management by guardbanding the numbers.
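As a toy illustration of that two-trajectory arithmetic, the sketch below computes profit per feature bundle and guardbands the sales side. The bundle names and all dollar figures are made up; this shows only the shape of the calculation, not Sapient’s actual models.

```python
# Toy bundle-profitability calculation with guardbanding. Bundles,
# revenues, and costs are hypothetical; only the arithmetic matters.

bundles = {
    # bundle name: (projected revenue $M, implementation cost $M)
    "base":             (40.0, 18.0),
    "base + security":  (55.0, 26.0),
    "full feature set": (70.0, 41.0),
}

GUARDBAND = 0.15  # assume +/-15% uncertainty on the sales projection

for name, (revenue, cost) in bundles.items():
    nominal = revenue - cost                  # best-guess profit
    worst = revenue * (1 - GUARDBAND) - cost  # pessimistic sales
    best = revenue * (1 + GUARDBAND) - cost   # optimistic sales
    print(f"{name:16s} nominal ${nominal:5.1f}M, "
          f"guardbanded ${worst:.1f}M to ${best:.1f}M")
```

A bundle whose guardbanded worst case still clears your profit bar is a safer bet than one whose nominal profit looks higher but whose worst case goes negative.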

Sapient is clear about the fact that they don’t generate the data – they present the data in ways that make it easier to visualize what’s going on. You use the same data you’ve always used; only now, in theory, it’s stored in one central place for easy retrieval rather than scattered through a variety of emails and reports and spreadsheets on various desks and disks.

It sounds pretty appealing, assuming that the adaptability is really as easy as claimed. And that’s the real key. While processes at IC companies don’t vary as widely as RUP versus agile, they do differ in ways that matter. It does no good to enter data into a huge model only to realize that there’s one key thing that can’t be done the way you do it, and the whole thing blows up.

The system – both the entry of data and its use in reporting and reviews – has to be natural for the users or, as IT managers and process mavens around the world can sadly testify, it simply won’t get used. It’ll shrink back into the corner, unloved and unattended, and will be spoken of quietly and reverentially as another of those ideas that was almost good but missed on three of the 1,528 critical requirements for the thing to be useful.

The other thing that remains to be proven is whether the uncertainties in building an IC can really be captured in such a database. Are the intangibles a bigger issue? It’s almost like an “accuracy vs. precision” thing: “These models demonstrate that our profits will be $3.256 million, plus or minus $25 million depending on what the economy does.”

To be clear, this isn’t my prediction of Sapient’s future; it’s the legacy that Sapient inherits. The good news is that the legacy provides countless examples of mistakes that don’t need to be made again. The challenge is that Sapient has to be mindful of them – and actually avoid repeating them.
