
From Movies to Data Analysis

Coventor’s SEMulator 3D Pivots

If you’ve been paying attention to the papers at the various advanced semiconductor process conferences, there’s a name you’re seeing more and more: Coventor. We’ve looked at them several times before, in the context of both their SEMulator 3D tool and their MEMS+ tool – the former for development of new semiconductor processes and the latter for designing MEMS devices.

Today we’re focusing on the SEMulator 3D tool, whose 6.1 version was recently announced. We’re doing so because the tool has turned a corner in how it’s used. Before going there, let’s first talk about where we’ve been in order to set the stage.

SEMulator 3D is probably most visible through its ability to animate a process flow. Start with some new idea and then simulate it to watch as a virtual wafer gets built. Need to prove that idea to someone else? An investor or a prospect, perhaps? Make a movie of the animation and display it. Convincing proof – and it looks slick. That paper presentation you saw where they animated some part of the process? Good chance that was the work of SEMulator 3D.

But, of course, that’s never been enough. Because as soon as you do one of those, you’re thinking, “Wait, what if I tweak this?” – and off you go again making a new movie, just like a regular Quentin Tarantino (but with less color saturation and less blood). And then you have to deal with variation within some of these tiny, complex features that might act one way on a Sunday and another on a Monday. And so now you have to explore the corners. More movies! Call it a trilogy! A pentology! A myriology!

While those might not be real words, that last one does hint at the scale of what’s going on – “myriad” originally indicating ten thousand (and, for the record, traditionally used as an adjective, not a noun – “We ran myriad experiments,” not “We ran a myriad of experiments” – although, as is the way with the English language, the more people do it wrong, the more acceptable it becomes). Because that’s exactly what’s happening: myriad experiments.

Various Solutions to Variation

How we deal with variation has changed over time. Once upon a time, we actually tried to get rid of it. I know – quaint, huh? Then we tried designing to a central process point and simulating corners to make the design resilient in the face of process variation. Now, according to Coventor, that last one has been turned on its head: we start with the corners and find where the best nominal settings will be.
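To make that concrete, here’s a minimal Python sketch of the corners-first idea, using a purely hypothetical one-knob etch step – the linear depth model, spec limits, and corner values are all invented for illustration, not Coventor’s: instead of picking a nominal etch time and then checking the corners, you sweep candidate nominals and keep only those whose corner outcomes stay in spec.

    # Hypothetical corners-first centering for a one-knob etch step.
    # All numbers and the linear model are illustrative only.
    SPEC_LO, SPEC_HI = 90.0, 110.0       # acceptable etch depth, nm
    NOMINAL_RATE = 2.0                   # assumed nominal etch rate, nm/s
    RATE_CORNERS = (0.95, 1.05)          # slow and fast corners (+/- 5% rate)

    def depth(etch_time, rate_scale):
        """Toy etch model: depth grows linearly with time."""
        return NOMINAL_RATE * rate_scale * etch_time

    # Sweep candidate nominal etch times; keep those that pass at every corner.
    candidates = [t / 10.0 for t in range(400, 601)]     # 40.0 .. 60.0 s
    robust = [t for t in candidates
              if all(SPEC_LO <= depth(t, c) <= SPEC_HI for c in RATE_CORNERS)]

    if robust:
        center = sum(robust) / len(robust)
        print(f"robust etch-time window: {robust[0]:.1f}-{robust[-1]:.1f} s, "
              f"center {center:.1f} s")
    else:
        print("no etch time survives every corner; rework the spec or the process")

The corners define the window first; the nominal setting falls out as the center of whatever survives.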

But there’s one thing required to do a good job at that: data. Lots of data. And the big question then becomes, “How best to get that data?” Traditionally, we’ve run wafers, turning various process knobs and measuring the outcomes. But that may mean stealing precious fab time from manufacturing or some other development. And each run takes a long time to process fully. And, while a physical build might seem like the gold standard for getting solid, reliable data, there’s a limit to what you can do.

In reality, you want more than just outcomes. You want all kinds of in-process and internal data as well so that you learn not only what happened, but you also get some clues as to why it happened – and what to change if the outcome isn’t ideal. So you instrument your dice and your wafers as best you can to give you those insights.

But this then becomes familiar territory for EDA types. In fact, historically, it’s been one of the challenges of emulation as opposed to simulation. You get way more speed with emulation, but you want the richness of data access that, in the old days, only simulation could give. Emulators have evolved to address this visibility consideration, but it takes hardware to do that. The thing about simulation – whether circuit or process – is that it’s all make-believe. You can wave your hand and instrument pretty much anything – there’s no node that’s inaccessible.

But here’s where the EDA/process similarities end. While emulators have bulked up on the hardware instrumentation that’s possible, fabs haven’t done so in the same way. So, if you really want all that internal data, process simulation is still the best way to get it.

And get it they have. We saw in a prior piece (the one linked above) that Coventor worked with Imec on process simulation. And they ran 1.4 million wafers – virtually. Yeah, it took about 5 weeks. But that’s for 1.4 million wafers – imagine how long that would take to run in a real fab.

And imagine looking at 1.4 million animations. Or manually scouring the intimate data details from the runs of 1.4 million wafers. This is where the new problem emerges, filed under “B” for “Be careful what you wish for.” Now you’re awash in data – what are you going to do about that?

I Need a Statistician – STAT!

Yup, that’s right. You do what any right-minded technologist would do: bail and hand the problem to someone who knows how to extract insights from large volumes of data: a statistician. It turns out that even many large companies don’t employ in-house statisticians, so, presumably, there’s a robust consulting market for such services.

And, in fact, Coventor did engage the services of statisticians to help them understand how SEMulator 3D could be made to analyze large volumes of data. And that data analysis capability is now a big part of the latest release. Big Data attacks yet another corner of the design tool world.

And we are talking data that’s big. How much data you get per wafer depends, of course, on how much you instrument the wafer – in a virtual sense. But if you’re looking for causes and effects and perhaps second- or third-order behaviors, you need to watch lots of points to catch the nuance. Yeah, many of those points will end up not contributing to a final solution, but you don’t know which data is relevant until you know the answer.
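For a flavor of why you capture everything and sort it out afterward, here’s a toy Python screening pass over synthetic data – the measurement names and the model driving the outcome are made up: rank every instrumented point by how strongly it correlates with the final outcome, and most of them fall to the bottom of the list.

    # Toy screening pass: synthetic per-wafer measurements, with only a couple
    # of the 200 instrumented points actually driving the outcome.
    import random
    import statistics

    random.seed(0)
    N_WAFERS = 1000
    POINTS = [f"meas_{i:03d}" for i in range(200)]

    wafers = []
    for _ in range(N_WAFERS):
        row = {p: random.gauss(0.0, 1.0) for p in POINTS}
        row["outcome"] = (3.0 * row["meas_007"] - 2.0 * row["meas_042"]
                          + random.gauss(0.0, 0.5))
        wafers.append(row)

    def corr(xs, ys):
        """Pearson correlation coefficient."""
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) * sx * sy)

    outcome = [w["outcome"] for w in wafers]
    ranked = sorted(POINTS,
                    key=lambda p: abs(corr([w[p] for w in wafers], outcome)),
                    reverse=True)
    print("most influential points:", ranked[:5])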

So picture heavily instrumented wafers that are subject to, say, 100,000 overall variations. You’re going to be scooping up data like Ocean Spray scoops up cranberries. And you have to put it someplace quickly so that you can keep up, and then you need to analyze it. It’s that analysis that’s new.
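As a rough sketch of that capture step – in Python, with run_variation() standing in for whatever actually produces a run’s measurements (it’s not a real SEMulator 3D API) – you stream each run’s results straight to disk as it finishes so that capture keeps pace with simulation:

    # Stream each virtual run's measurements to disk as it completes.
    # run_variation() is a placeholder, not a real tool API; the fields and
    # numbers are invented for illustration.
    import csv
    import random

    FIELDS = ["run_id", "etch_time", "dep_rate", "cd_top", "cd_bottom"]

    def run_variation(run_id):
        """Placeholder for one simulated process variation."""
        etch_time = random.uniform(45.0, 55.0)
        dep_rate = random.uniform(0.9, 1.1)
        return {
            "run_id": run_id,
            "etch_time": etch_time,
            "dep_rate": dep_rate,
            "cd_top": 20.0 + 0.1 * etch_time * dep_rate + random.gauss(0.0, 0.05),
            "cd_bottom": 18.0 + 0.08 * etch_time + random.gauss(0.0, 0.05),
        }

    with open("virtual_runs.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for run_id in range(100_000):        # the 100,000 overall variations
            writer.writerow(run_variation(run_id))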

You might wonder what impact so much data analysis might have on the software run times. But Coventor says that it’s not the rate-limiting step – far from it. Running the simulation and capturing the data – much of which can be parallelized – is still the vast bulk of the work to be done. They say that analysis probably scales differently from the data gathering, but that it’s a small enough portion of overall runtime that they’re not focusing on that at the moment.
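For a rough sense of how that splits – again a hedged Python sketch, where simulate_run() is a stand-in and the work split is assumed rather than measured – the independent runs fan out across worker processes, while the summary statistics come afterward, serially, because they’re cheap by comparison:

    # Independent virtual runs fan out across processes; analysis stays serial.
    # simulate_run() is a placeholder that just returns one outcome per run.
    from concurrent.futures import ProcessPoolExecutor
    import statistics

    def simulate_run(run_id: int) -> float:
        """Placeholder 'simulation' producing a single outcome value."""
        return 100.0 + 0.001 * ((run_id * 7919) % 101 - 50)

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            outcomes = list(pool.map(simulate_run, range(10_000), chunksize=256))
        # The analysis step: tiny next to the simulation work above.
        print(f"mean {statistics.fmean(outcomes):.3f}, "
              f"stdev {statistics.pstdev(outcomes):.4f}")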

So there you have it. Once a useful tool for turning out particularly good visualizations of what’s happening to a single wafer as it’s built, SEMulator 3D has now pivoted to the less glamorous, but more powerful, work of making sense out of gobs of data from myriad wafers.

 

More info:

Coventor SEMulator 3D
