
Big Data in Semi Manufacturing

Rummaging for Better Results

He looked yet one more time out onto the crime scene. The answer was out there somewhere. But how many times could he come back “with fresh eyes,” hoping to see something he hadn’t seen the last time? Dried leaves turned a particular way… dirt compacted ever so slightly more here than there… distinctions hard to make out with the naked eye, but, given enough data and some way to sort through it all, he knew a pattern would emerge that would lead him to the answers he needed.

A few weeks back, we looked at how Big Data was impacting the world of design management. But that’s not the only corner of the semiconductor world infected by the Big Data bug. At this summer’s Semicon West show, I met up with a couple of companies using data analysis to improve different aspects of semiconductor manufacturing.

The idea behind any of these ventures is that there’s a ton of data out there. We use it to a limited extent, but there’s tons more we could learn if we aggregated it and sliced and diced it the proper way. And, if we automate some of that – once we figure out exactly what recipe we want to follow – then we can integrate it into the actual manufacturing flow, providing control feedback as well as alerts and notifications that something might be amiss. Or we might be able to build small, inexpensive equipment that competes with much larger units.

One of the new things about Big Data is a different way of organizing the data crudely (or, more accurately, not organizing it) so that you can keep up with the firehose feeding yet more data into the system, even as impatient analysts want to see the latest NOW and make inferences and see trends before the Other Guy does. The poster child for this new approach is the Hadoop project.

However, one of the reasons for taking that approach is the broad, unpredictable variety of things that need to be cached away – it’s particularly helpful when the incoming data has little or no inherent structure. That’s not the case with the companies we’re going to discuss. They have highly structured data, so they don’t use a Hadoop-like approach. They’re still faced, however, with the challenge of processing the data quickly enough to provide low-latency feedback on an active production line.

The first company we’ll look at is called Optimal+ (in an earlier life, called Optimal Test). Their focus is on analyzing reams and reams of test data and drawing conclusions that can impact current work in progress. They recently announced that they’re aligning with HP Vertica as a database platform for supporting the processing performance they require.

Their baseline product is called Global Ops, and you might call it generic in that it’s simply a way of taking data from all aspects of the manufacturing flow, learning something, and potentially changing the flow. We’re talking, in many cases, about incremental tweaks, and the trick is always walking that tightrope to improve yield or throughput without reducing quality.

The idea is to analyze the bejeebus out of the data and come up with ideas for new “rules.” Those rules can then be tested across the range of historical data to show what might have happened had those rules been in place back then. If you like the way things turn out, then – and this is key – you can push the new rules out for immediate implementation across the supply chain. That means fabless companies impacting wafer production or fabful companies impacting an offshore test house.
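To make that backtesting idea concrete, here’s a minimal Python sketch of how a candidate rule might be replayed against historical test records to see what it would have flagged. The data structure, the rule, and the thresholds are all hypothetical illustrations, not Optimal+’s actual data model or API.

```python
# Hypothetical sketch of rule backtesting: define a candidate rule, replay it
# over historical test records, and tally what the outcome would have been.
from dataclasses import dataclass
from statistics import median
from typing import Callable, List

@dataclass
class DieResult:
    wafer_id: str
    die_id: str
    idd_ma: float        # a parametric test value, e.g. supply current in mA
    passed_final: bool   # did this die ultimately pass downstream screening?

def outlier_rule(die: DieResult, wafer_median: float, limit: float = 1.25) -> bool:
    """Flag a die whose parametric value strays too far from its wafer's median."""
    return die.idd_ma > wafer_median * limit

def backtest(history: List[DieResult], rule: Callable[[DieResult, float], bool]) -> dict:
    """Replay a candidate rule over historical data and tally its hypothetical impact."""
    by_wafer = {}
    for die in history:
        by_wafer.setdefault(die.wafer_id, []).append(die)

    flagged = yield_loss = escapes_caught = 0
    for dies in by_wafer.values():
        wafer_median = median(d.idd_ma for d in dies)
        for die in dies:
            if rule(die, wafer_median):
                flagged += 1
                if die.passed_final:
                    yield_loss += 1      # good die we would have thrown away
                else:
                    escapes_caught += 1  # bad die we would have caught earlier
    return {"flagged": flagged, "yield_loss": yield_loss, "escapes_caught": escapes_caught}

if __name__ == "__main__":
    history = [
        DieResult("W1", "D1", 10.1, True),
        DieResult("W1", "D2", 10.4, True),
        DieResult("W1", "D3", 14.9, False),  # outlier that later failed anyway
        DieResult("W2", "D1", 9.8, True),
        DieResult("W2", "D2", 10.0, True),
        DieResult("W2", "D3", 13.5, True),   # outlier that would have passed
    ]
    print(backtest(history, outlier_rule))
```

If the tallies look good against history, the rule is a candidate for being pushed out to the live flow; if the yield hit outweighs the escapes caught, it isn’t.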

More specific products can be laid over this Global Ops platform. There’s one for managing the test floor, one for reducing test time, one for detecting outliers, one for preventing escapes, and one optimized for high-volume production – managing the ramp of a new product or the yield of an established product.

Note that all of these solutions are centered on test data. No process monitor data or other internal fab information is used in the analysis. Latency from new test data to completed analysis is in the range of 7-10 minutes.

The second company I spoke with is Nanotronics, and there appear to be a couple of things going on. These guys focus on inspection at the atomic level – atomic-force microscopy (AFM). They claim that their relatively small systems can do a range of things normally covered by families of big inspection platforms from folks like Applied and KLA-Tencor.

Instead, they use computational photography techniques to improve feature resolution, and they apply data analysis both within a die, to identify minuscule features, and across an entire wafer, to identify macro-level features like a big scratch. Such a defect, crossing multiple dice, would be apparent only by analyzing the results of the entire wafer rather than an individual die.
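As a rough illustration of why the wafer-level view matters, here’s a small Python sketch showing how defects that look isolated within each die line up once they’re mapped into common wafer coordinates. The die size, the line-fit heuristic, and the tolerances are assumptions made for the example; this is a toy version of the idea, not Nanotronics’ algorithm.

```python
# Toy illustration of die-level vs. wafer-level analysis: each die sees only a
# defect or two, but translating every die's defect coordinates into one wafer
# coordinate frame reveals that they fall along a line -- the signature of a
# scratch crossing multiple dice.
import numpy as np

DIE_SIZE = 10.0  # mm, assumed square die for illustration

def to_wafer_coords(die_row: int, die_col: int, x_mm: float, y_mm: float):
    """Translate a defect position within a die into wafer-level coordinates."""
    return die_col * DIE_SIZE + x_mm, die_row * DIE_SIZE + y_mm

def looks_like_scratch(defects_wafer, min_dice: int = 3, tol_mm: float = 0.5) -> bool:
    """Crude test: do defects from several dice fall close to one straight line?"""
    pts = np.array([pt for pt, _ in defects_wafer])
    dice = {die for _, die in defects_wafer}
    if len(dice) < min_dice:
        return False
    # Fit a line y = m*x + b through all wafer-level defect positions
    m, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    residuals = np.abs(pts[:, 1] - (m * pts[:, 0] + b))
    return bool(np.all(residuals < tol_mm))

if __name__ == "__main__":
    # One or two defects per die -- unremarkable at the die level...
    per_die = {
        (0, 0): [(8.0, 7.5)],
        (0, 1): [(2.0, 8.3), (6.0, 8.9)],
        (0, 2): [(1.0, 9.2)],
    }
    defects = [(to_wafer_coords(r, c, x, y), (r, c))
               for (r, c), pts in per_die.items() for (x, y) in pts]
    # ...but in wafer coordinates they trace one continuous streak.
    print("scratch suspected:", looks_like_scratch(defects))
```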

While the larger competing platforms tend to use a variety of wavelengths and approaches for different jobs in order to ensure a reliable “signal,” Nanotronics claims to be able to get that same signal without all of that varied hardware. That said, however, we are talking about a table-top setup, so, even though they provide automation tools like a wafer handler, it’s hard to imagine this running high volumes and putting the other guys out of business.

The aspect of the platform that flies the Big Data flag most prominently is their learning capabilities – both unsupervised and supervised; their feature recognition relies on this. The other Big-Data-like characteristic is that, from a single scan, they can run multiple different analyses on the resulting dataset (examples they give are die yield analysis and defect detection).
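Here’s a minimal sketch of that single-scan, multiple-analysis idea: one captured dataset handed to several independent analysis routines, with no re-scan. The routines themselves (a naive defect count and a tile-based yield estimate) are placeholders chosen purely for illustration.

```python
# One scan, many analyses: the wafer is imaged once, and several independent
# analyses are run over the same stored dataset. Placeholder analyses only.
import numpy as np

def count_defects(scan: np.ndarray, threshold: float = 0.98) -> int:
    """Count pixels whose intensity deviates strongly from the norm."""
    return int(np.sum(scan > threshold))

def estimate_die_yield(scan: np.ndarray, die_px: int = 8, max_defects: int = 2) -> float:
    """Split the scan into die-sized tiles and call each die good or bad."""
    rows, cols = scan.shape[0] // die_px, scan.shape[1] // die_px
    good = 0
    for r in range(rows):
        for c in range(cols):
            tile = scan[r*die_px:(r+1)*die_px, c*die_px:(c+1)*die_px]
            good += count_defects(tile) <= max_defects
    return good / (rows * cols)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scan = rng.random((64, 64))             # one scan of the wafer...
    analyses = {"defects": count_defects,    # ...reused by every analysis
                "die_yield": estimate_die_yield}
    print({name: fn(scan) for name, fn in analyses.items()})
```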

While much of their focus is on semiconductors, they’re also playing in the bioscience arena, analyzing viruses, bacteria, and cells. Same basic approach and scale; different features.

One of the big distinctions between these two data-oriented platforms is the openness of the analysis. Optimal+ is specifically about giving data to users and letting them customize the analysis and the learning. Nanotronics, by contrast, keeps its algorithms under the hood, using them as a vehicle for improving the user’s inspection experience. In one case Big Data rises to the fore; in the other, it sinks into the background.

In either case, it changes the rules of the game. Once novel, Big Data is rapidly becoming commonplace as we figure out how to rummage through it all in a productive manner.

 

More info:

Nanotronics

Optimal+

 
