
Big Data in Semi Manufacturing

Rummaging for Better Results

He looked yet one more time out onto the crime scene. The answer was out there somewhere. But how many times could he come back “with fresh eyes,” hoping to see something he hadn’t seen the last time? Dried leaves turned a particular way… dirt compacted ever more slightly here than there… distinctions hard to make out with the naked eye, but, given enough data and some way to sort through it all, he knew a pattern would emerge that would lead him to the answers he needed.

A few weeks back, we looked at how Big Data was impacting the world of design management. But that’s not the only corner of the semiconductor world infected by the Big Data bug. At this summer’s Semicon West show, I met up with a couple of companies using data analysis to improve different aspects of semiconductor manufacturing.

The idea behind any of these ventures is that there’s a ton of data out there. We use it to a limited extent, but there’s tons more we could learn if we aggregated it and sliced and diced it the proper way. And, if we automate some of that – once we figure out exactly what recipe we want to follow – then we can integrate it into the actual manufacturing flow, providing control feedback as well as alerts and notifications that something might be amiss. Or we might be able to build small, inexpensive equipment that competes with much larger units.

One of the new things about Big Data is a different, cruder way of organizing the data (or, more accurately, not organizing it) so that you can keep up with the firehose feeding yet more data into the system, even as impatient analysts want to see the latest NOW and make inferences and spot trends before the Other Guy does. The poster child for this new approach is the Hadoop project.

However, one of the reasons for taking that approach is the broad, unpredictable variety of things that need to be cached away – it’s particularly helpful when the incoming data has little or no inherent structure. That’s not the case with the companies we’re going to discuss. They have highly structured data, so they don’t use a Hadoop-like approach. They’re still faced, however, with the challenge of processing the data quickly enough to provide low-latency feedback on an active production line.

The first company we’ll look at is called Optimal+ (known in an earlier life as Optimal Test). Their focus is on analyzing reams and reams of test data and drawing conclusions that can impact current work in progress. They recently announced that they’re aligning with HP Vertica as a database platform to support the processing performance they require.

Their baseline product is called Global Ops, and you might consider it generic in that it’s simply a way of taking data from all aspects of the manufacturing flow, learning something, and potentially changing the flow. We’re talking, in many cases, about incremental tweaks, and the trick is always walking that tightrope to improve yield or throughput without reducing quality.

The idea is to analyze the bejeebus out of the data and come up with ideas for new “rules.” Those rules can then be tested across the range of historical data to show what might have happened had those rules been in place back then. If you like the way things turn out, then – and this is key – you can push the new rules out for immediate implementation across the supply chain. That means fabless companies impacting wafer production or fabful companies impacting an offshore test house.
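To make that propose-then-replay loop a little more concrete, here’s a minimal Python sketch of the idea. The record fields, the rule, and the thresholds are all invented for illustration; none of this reflects Optimal+’s actual software or APIs.

# Hypothetical sketch of "propose a rule, replay it against history."
# All names and limits are invented; this is not Optimal+'s software.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class DieRecord:
    wafer_id: str
    die_xy: tuple[int, int]
    iddq_ua: float            # quiescent current measurement, in microamps
    passed_functional: bool

# A candidate rule: flag dice whose IDDQ is suspiciously high even though they pass.
def high_iddq_rule(die: DieRecord, limit_ua: float = 150.0) -> bool:
    return die.passed_functional and die.iddq_ua > limit_ua

def backtest(rule: Callable[[DieRecord], bool], history: Iterable[DieRecord]) -> dict:
    """Replay a proposed rule over historical test data to see what it would
    have rejected, so the yield cost can be weighed before deployment."""
    total = flagged = 0
    for die in history:
        total += 1
        if rule(die):
            flagged += 1
    return {"dice_evaluated": total,
            "would_have_flagged": flagged,
            "yield_impact_pct": 100.0 * flagged / total if total else 0.0}

The point is simply that a candidate rule is a function, and backtesting it is just replaying that function over stored results and tallying the yield it would have cost. Only after that accounting looks acceptable would anything be pushed out to the live flow.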

More specific products can be laid over this Global Ops platform. There’s one for managing the test floor, one for reducing test time, one for detecting outliers, one for preventing escapes, and one optimized for high-volume production – managing the ramp of a new product or the yield of an established product.
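The outlier-detection piece, in particular, is easy to picture: dice that pass every limit but sit far from their neighbors in a parametric distribution get flagged as suspect. A robust median-based screen, in the spirit of part average testing, is one generic way to do that. The sketch below is exactly that, a generic illustration, not Optimal+’s algorithm; the k factor and the data are made up.

# Generic robust outlier screen in the spirit of part average testing (PAT).
# Illustrative only; the k factor and the data are made up.
import statistics

def mad_outliers(values: list[float], k: float = 6.0) -> list[int]:
    """Return indices of values more than k robust sigmas from the median,
    using the median absolute deviation (MAD) scaled to approximate sigma."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    robust_sigma = 1.4826 * mad or 1e-12    # guard against flat data
    return [i for i, v in enumerate(values)
            if abs(v - med) / robust_sigma > k]

# One parametric measurement across the dice of a single wafer:
leakage_na = [12.1, 11.8, 12.4, 12.0, 11.9, 48.7, 12.2, 12.3]
print(mad_outliers(leakage_na))    # -> [5]; the 48.7 nA die stands out from its wafer-mates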

Note that all of these solutions are centered on test data. No process monitor data or other internal fab information is used in the analysis. Latency from new test data to completed analysis is in the range of 7-10 minutes.

The second company I spoke with is Nanotronics, and there appear to be a couple of things going on there. These guys focus on inspection at the atomic level – atomic-force microscopy (AFM). They claim to do with relatively small systems a range of things normally covered by families of big inspection platforms from folks like Applied and KLA-Tencor.

These guys instead use computational photography techniques to improve feature resolution, and they use data analysis within a die to identify minuscule features and across an entire wafer to identify macro-level features like a big scratch. Such a defect, crossing multiple dice, would be apparent only when analyzing the results of the entire wafer rather than an individual die.
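The scratch example is really a spatial-clustering problem: each affected die may show only a small anomaly, but plotted together on the wafer map the anomalies line up into a connected streak. Here’s a hypothetical sketch of that kind of wafer-level aggregation; the connectivity rule and the data are assumptions, and this is not Nanotronics’ actual analysis.

# Hypothetical wafer-level aggregation: per-die anomalies that look minor in
# isolation can form a connected streak (a scratch) when the whole map is viewed.
# Illustrative sketch only, not Nanotronics' analysis.
from collections import deque

def connected_defect_clusters(defect_dice: set[tuple[int, int]]) -> list[set]:
    """Group defective die coordinates into 8-connected clusters on the wafer map."""
    remaining, clusters = set(defect_dice), []
    while remaining:
        seed = remaining.pop()
        cluster, queue = {seed}, deque([seed])
        while queue:
            x, y = queue.popleft()
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nbr = (x + dx, y + dy)
                    if nbr in remaining:
                        remaining.remove(nbr)
                        cluster.add(nbr)
                        queue.append(nbr)
        clusters.append(cluster)
    return clusters

# Dice that look unremarkable one at a time, but four of them form a diagonal streak:
flagged = {(3, 3), (4, 4), (5, 5), (6, 6), (10, 2)}
streaks = [c for c in connected_defect_clusters(flagged) if len(c) >= 3]
print(streaks)    # one 4-die diagonal cluster: more likely a scratch than random defects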

While the larger competing platforms tend to use a variety of wavelengths and approaches for different jobs in order to ensure a reliable “signal,” Nanotronics claims to be able to get that same signal without all of that varied hardware. That said, however, we are talking about a table-top setup, so, even though they provide automation tools like a wafer handler, it’s hard to imagine this running high volumes and putting the other guys out of business.

The aspect of the platform that flies the Big Data flag most prominently is its learning capabilities – both unsupervised and supervised; their feature recognition relies on this. The other Big-Data-like characteristic is that, from a single scan, they can run multiple different analyses on the resulting dataset (examples they give are die yield analysis and defect detection).
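In spirit, that’s one acquisition feeding several independent analysis passes. The toy sketch below shows the pattern with two crude stand-in analyses run on the same array; the functions, thresholds, and synthetic data are all invented and have nothing to do with the Nanotronics software.

# Toy sketch of "scan once, analyze many ways." Everything here is invented.
import numpy as np

def defect_count(scan: np.ndarray, threshold: float = 0.8) -> int:
    """Count bright outlier pixels as a crude stand-in for defect detection."""
    return int((scan > threshold).sum())

def die_yield_estimate(scan: np.ndarray, grid: int = 4, max_defects: int = 0) -> float:
    """Tile the scan into a grid of pseudo-dice and call each one good or bad."""
    h, w = scan.shape
    good = total = 0
    for i in range(grid):
        for j in range(grid):
            tile = scan[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
            total += 1
            good += defect_count(tile) <= max_defects
    return good / total

rng = np.random.default_rng(0)
scan = rng.random((256, 256)) * 0.5          # one synthetic acquisition, mostly clean
scan[40, 40] = scan[200, 10] = 1.0           # two injected bright "defects"
print({"defects": defect_count(scan),        # both analyses run on the same dataset
       "yield": die_yield_estimate(scan)})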

While much of their focus is on semiconductors, they’re also playing in the biosciences, analyzing viruses, bacteria, and cells. Same basic approach and scale; different features.

One of the big distinctions between these two data-oriented platforms is the openness of the analysis. Optimal+ is specifically about giving data to users and letting them customize the analysis and the learning. Nanotronics, by contrast, keeps its algorithms under the hood, using them as a vehicle for improving the user’s inspection experience. In one case Big Data rises to the fore; in the other, it sinks into the background.

In either case, it changes the rules of the game. Once novel, Big Data is rapidly becoming commonplace as we figure out how to rummage through it all in a productive manner.

 

More info:

Nanotronics

Optimal+
