
From Movies to Data Analysis

Coventor’s SEMulator 3D Pivots

If you’ve been paying attention to the papers at the various advanced semiconductor process conferences, there’s a name you’re seeing more and more: Coventor. We’ve looked at them several times before, in the context of both their SEMulator 3D tool and their MEMS+ tool – the former for developing new semiconductor processes and the latter for designing MEMS devices.

Today we’re focusing on the SEMulator 3D tool, whose 6.1 version was recently announced. We’re doing so because the tool has turned a corner in how it’s used. Before going there, let’s set the stage with a quick look at where we’ve been.

SEMulator 3D is probably most visible through its ability to animate a process flow. Start with some new idea and then simulate it to watch as a virtual wafer gets built. Need to prove that idea to someone else? An investor or a prospect, perhaps? Make a movie of the animation and display it. Convincing proof – and it looks slick. That paper presentation you saw where they animated some part of the process? Good chance that was the work of SEMulator 3D.

But, of course, that’s never been enough. Because as soon as you do one of those, you’re thinking, “Wait, what if I tweak this?” – and off you go again making a new movie, just like a regular Quentin Tarantino (but with less color saturation and less blood). And then you have to deal with variation within some of these tiny, complex features that might act one way on a Sunday and another on a Monday. And so now you have to explore the corners. More movies! Call it a trilogy! A pentalogy! A myriology!

While those might not all be real words, that last one does hint at the scale of what’s going on – “myriad” originally indicated ten thousand (and, for the record, it’s traditionally an adjective, not a noun – “We ran myriad experiments,” not “We ran a myriad of experiments” – although, as is the way with the English language, the more people do it wrong, the more acceptable it becomes). Because ten-thousands-scale experimentation is exactly what’s happening.

Various Solutions to Variation

How we deal with variation has changed over time. Once upon a time, we actually tried to get rid of it. I know – quaint, huh? Then we tried designing to a central process point and simulating corners to make the design resilient in the face of process variation. Now, according to Coventor, that last one has been turned on its head: we start with the corners and find where the best nominal settings will be.
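
To make that corners-first idea concrete, here’s a toy sketch of design centering in Python. Everything in it – the knob names, the numbers, and the stand-in simulate_cd_error function – is invented for illustration; this is the shape of the search, not Coventor’s API.

```python
# Toy "corners-first" design centering: score every candidate nominal
# setting by its WORST process corner, and keep the nominal that holds up
# best. All knobs, numbers, and the simulation stub are hypothetical.
from itertools import product

ETCH_TIMES = [28.0, 30.0, 32.0]       # candidate nominal etch times, seconds
DEP_THICKNESSES = [9.5, 10.0, 10.5]   # candidate nominal dep thicknesses, nm
CORNER_DELTAS = [-1.0, 0.0, 1.0]      # +/- one "sigma" excursion per knob
SPEC_NM = 2.0                         # max allowed CD error, nm

def simulate_cd_error(etch, dep):
    """Stand-in for a real process simulation; returns |CD error| in nm."""
    return abs(0.4 * (etch - 30.0)) + abs(0.8 * (dep - 10.0))

best = None
for etch_nom, dep_nom in product(ETCH_TIMES, DEP_THICKNESSES):
    # Worst-case CD error across every corner combination of this nominal.
    worst = max(
        simulate_cd_error(etch_nom + de, dep_nom + dd)
        for de, dd in product(CORNER_DELTAS, repeat=2)
    )
    if best is None or worst < best[0]:
        best = (worst, etch_nom, dep_nom)

worst, etch, dep = best
print(f"best nominal: etch={etch}s dep={dep}nm, worst-corner error={worst:.2f}nm")
print("meets spec at all corners" if worst <= SPEC_NM else "no nominal survives the corners")
```

The point is the inversion: instead of simulating a nominal and then checking its corners, the loop scores every candidate nominal by its worst corner and keeps the one that degrades least.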

But there’s one thing required to do a good job at that: data. Lots of data. And the big question then becomes, “How best to get that data?” Traditionally, we’ve run wafers, turning various process knobs and measuring the outcomes. But that may mean stealing precious fab time from manufacturing or some other development. And each run takes a long time to process fully. And, while a physical build might seem like the gold standard for getting solid, reliable data, there’s a limit to what you can do.

In reality, you want more than just outcomes. You want all kinds of in-process and internal data as well so that you learn not only what happened, but you also get some clues as to why it happened – and what to change if the outcome isn’t ideal. So you instrument your dice and your wafers as best you can to give you those insights.

But this then becomes familiar territory for EDA types. In fact, historically, it’s been one of the challenges of emulation as opposed to simulation. You get way more speed with emulation, but you want the richness of data access that, in the old days, only simulation could give. Emulators have evolved to address this visibility consideration, but it takes hardware to do that. The thing about simulation – whether circuit or process – is that it’s all make-believe. You can wave your hand and instrument pretty much anything – there’s no node that’s inaccessible.
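
As a quick illustration of that anything-is-probeable property, here’s a minimal sketch of virtual instrumentation – the helper, step name, and metric below are hypothetical stand-ins, not anything from an actual tool:

```python
# In simulation, "adding a probe" is just another measure() call – nothing
# is inaccessible. After each process step, snapshot whatever internal
# measurements you want into a log.
step_log = []

def instrumented_step(wafer_id, name, apply_step, measure):
    apply_step()                          # advance the virtual wafer one step
    step_log.append({"wafer": wafer_id, "step": name, **measure()})

# Usage with a trivial stand-in step:
wafer = {"oxide_nm": 0.0}
instrumented_step(
    wafer_id=1,
    name="oxide_dep",
    apply_step=lambda: wafer.update(oxide_nm=wafer["oxide_nm"] + 10.0),
    measure=lambda: {"oxide_nm": wafer["oxide_nm"]},
)
print(step_log)   # [{'wafer': 1, 'step': 'oxide_dep', 'oxide_nm': 10.0}]
```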

But here’s where the EDA/process similarities end. While emulators have bulked up on the hardware instrumentation that’s possible, fabs haven’t done so in the same way. So, if you really want all that internal data, process simulation is still the best way to get it.

And get it they have. We saw in a prior piece (the one linked above) that Coventor worked with Imec on process simulation. And they ran 1.4 million wafers – virtually. Yeah, it took about 5 weeks. But that’s for 1.4 million wafers – imagine how long that would take to run in a real fab.
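
For scale: five weeks is about three million seconds, so 1.4 million virtual wafers averages out to roughly one finished wafer every two seconds – in aggregate, across however much parallel compute was thrown at it. A physical fab, with cycle times measured in weeks per lot, isn’t in the same universe.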

And imagine looking at 1.4 million animations. Or manually scouring the intimate data details from the runs of 1.4 million wafers. This is where the new problem emerges, filed under “B” for “Be careful what you wish for.” Now you’re awash in data – what are you going to do about that?

I Need a Statistician – STAT!

Yup, that’s right. You do what any right-minded technologist would do: bail and hand the problem to someone who knows how to extract insights from large volumes of data: a statistician. It turns out that even many large companies don’t employ in-house statisticians, so, presumably, there’s a robust consulting market for such services.

And, in fact, Coventor did engage the services of statisticians to help them understand how SEMulator 3D could be made to analyze large volumes of data. And that data analysis capability is now a big part of the latest release. Big Data attacks yet another corner of the design tool world.

And we are talking data that’s big. How much data you get per wafer depends, of course, on how much you instrument the wafer – in a virtual sense. But if you’re looking for causes and effects and perhaps second- or third-order behaviors, you need to watch lots of points to catch the nuance. Yeah, many of those points will end up not contributing to a final solution, but you don’t know which data is relevant until you know the answer.

So picture heavily instrumented wafers that are subject to, say, 100,000 overall variations. You’re going to be scooping up data like Ocean Spray scoops up cranberries. And you have to put it someplace quickly so that you can keep up, and then you need to analyze it. It’s that analysis that’s new.
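
What might that analysis look like in miniature? Here’s a hedged sketch: given a table of instrumented measurements plus a final outcome for each virtual wafer, rank which monitored points actually track the outcome. The column names and the outcome model are invented, and a real dataset would have thousands of columns and those 100,000 rows rather than four and 1,000.

```python
# Toy relevance ranking over instrumented data: correlate each monitored
# point with the final outcome and sort by strength. Column names and the
# synthetic outcome model are made up for illustration.
import math, random

random.seed(0)
COLS = ["fin_cd", "spacer_t", "etch_depth", "liner_t"]

rows = []
for _ in range(1000):                     # stand-in for ~100,000 variations
    x = {c: random.gauss(0, 1) for c in COLS}
    # Hypothetical outcome: driven mostly by two of the four signals.
    x["outcome"] = 2.0 * x["fin_cd"] - 0.5 * x["etch_depth"] + random.gauss(0, 0.2)
    rows.append(x)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

outcome = [r["outcome"] for r in rows]
ranked = sorted(COLS, key=lambda c: -abs(pearson([r[c] for r in rows], outcome)))
print("monitored points, most to least correlated with outcome:", ranked)
```

Even this toy version shows why you over-instrument: the ranking is cheap to compute after the fact, but only if the data was captured in the first place.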

You might wonder what impact all this data analysis has on software run times. But Coventor says that it’s not the rate-limiting step – far from it. Running the simulation and capturing the data – much of which can be parallelized – still accounts for the vast bulk of the work. They say that the analysis probably scales differently from the data gathering, but that it’s a small enough portion of overall runtime that they’re not focusing on it at the moment.
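
A minimal sketch of why that breakdown holds: independent virtual-wafer variations parallelize trivially across cores, while the analysis is a single quick pass over the collected results. The simulate_variation function below is a trivial stand-in for what would be hours of real simulation work.

```python
# The slow part (simulation) fans out across cores; the new analysis part
# is one cheap sequential pass afterward. simulate_variation is a stand-in.
from multiprocessing import Pool

def simulate_variation(knobs):
    etch, dep = knobs
    # ...hours of real work in a real tool; trivial math here...
    return {"etch": etch, "dep": dep,
            "cd_err": abs(etch - 30) * 0.4 + abs(dep - 10) * 0.8}

if __name__ == "__main__":
    variations = [(30 + e, 10 + d) for e in (-1, 0, 1) for d in (-1, 0, 1)]
    with Pool() as pool:
        results = pool.map(simulate_variation, variations)  # slow, parallel
    best = min(results, key=lambda r: r["cd_err"])          # fast analysis
    print("best variation:", best)
```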

So there you have it. Once a useful tool for turning out particularly good visualizations of what’s happening to a single wafer as it’s built, SEMulator 3D has now pivoted to the less glamorous, but more powerful, work of making sense out of gobs of data from myriad wafers.

 

More info:

Coventor SEMulator 3D

