Mar 10, 2014

Internet of Things - This Season's Trend

posted by Dick Selwood

Later this week I will be reporting on the embedded world conference, where the Internet of Things was the major topic. Just before embedded world was Mobile World Congress, which has become as big a circus as the Consumer Electronics Show. There, again, the Internet of Things was a huge topic.

Today CeBIT opens in Hannover. Once just a specialist computing exhibition and conference spun off from the massive industrial exhibition of the Hannover Fair, it too has become enormous. And last night, at the opening of the fair by Chancellor Merkel of Germany, the British Prime Minister, David Cameron, announced that the British Government is going to invest £73 million (around $120 million) in research in areas linked to the Internet of Things.

Perhaps not entirely by coincidence, several of the heavyweight British Sunday papers devoted several pages to explaining the Internet of Things to their readers. With all this hype, it has the appearance of being another tech bubble. But it would be wrong to dismiss it as that. Whatever the public gesturing, interconnectivity, remote monitoring and control of domestic appliances, and all the other things pouring into the Internet of Things soup form a trend that is not going to be reversed. The job for engineers is surely to make sure that, as these things come together, they are secure, safe, and reliable.

Mar 06, 2014

DSA Update

posted by Bryon Moyer

The focus of the directed self-assembly (DSA) discussion at SPIE Advanced Litho has changed. In past years, it felt more like the efforts were largely about corralling this interesting new wild thing, or even seeing whether it could be corralled at all.

Well, this year it felt more like it’s in the corral, but there’s lots more training to do to make it a well-behaved showhorse. The focus is now on manufacturability. What are the tweaks and changes needed to turn this into a reliable, predictable process?

We’ve covered the basics of DSA before, but part of its distinctive character is in the sensitivity of the self-assembly process to subtle effects. Figuring out what matters and what doesn’t – and how it might be made more robust – is part of what’s going on now.

The biggest topic is defectivity. The desired pattern is consistent rows of lines or dots; the opposite is the fingerprint-like look of a randomly ordered pattern. The defects you might see now are such that you mostly have, say, parallel lines, but occasionally one line meanders over to another, or you get some other hint of the latent fingerprint. Because these defects are different from those you might be used to with more traditional lithography techniques, work is still needed to characterize and measure the specific defectivity modes.

You may recall that there are two versions of DSA: chemo-epitaxy and grapho-epitaxy. The former embeds a guide pattern underneath (kind of like damascene-style) where the block copolymers (BCPs) will go; that pattern has a chemical affinity for one of the two polymers, thereby guiding the final pattern. The latter sets up guides in the same plane as the BCP film; it becomes a physical rather than chemical guide. (I saw these guiding lines, typically simply called “guides,” referred to as “weirs” in one presentation).

One trend I noticed was that several presenters saw grapho-epitaxy being preferred for contacts and holes (often abbreviated simply as C/H), while chemo-epitaxy would be preferred for lines and spaces (L/S). One possible reason for that would be that the grapho-epitaxy guides take up space; no lines can go where they are, whereas no such space is lost with chemo-epitaxy.
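
To put a rough number on that trade-off, here's a back-of-the-envelope sketch of my own (not from any of the presentations). It assumes a natural BCP period L0, a guide wall of width `wall` that holds no lines, and trenches sized to hold a given number of lines each; all the values are made up.

```python
# Back-of-the-envelope sketch (my own illustration, not from the presentations):
# how much line density does grapho-epitaxy give up to its guides?

def effective_pitch_chemo(L0_nm):
    """Chemo-epitaxy: the guide is buried, so no layout area is lost - pitch is just L0."""
    return L0_nm

def effective_pitch_grapho(L0_nm, wall_nm, lines_per_trench):
    """Grapho-epitaxy: each trench of n*L0 is bounded by a wall that holds no lines,
    so the wall's width is amortized over the n lines in the trench."""
    trench_width = lines_per_trench * L0_nm
    return (trench_width + wall_nm) / lines_per_trench

if __name__ == "__main__":
    L0, wall = 28.0, 40.0          # nm - purely illustrative values
    for n in (4, 8, 16):
        print(f"{n:2d} lines/trench: chemo {effective_pitch_chemo(L0):.1f} nm/line, "
              f"grapho {effective_pitch_grapho(L0, wall, n):.1f} nm/line")
```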

There was, however, one presentation where grapho-epitaxy was used for L/S; they over-etched the guides to make them the same width as the final intended lines. In that way, rather than getting in the way of the lines, the guides themselves became lines.

Another trend with grapho-epitaxy is to “brush” various surfaces to further bias the self-assembly. This brushing involves a light coating of a material with an affinity for one of the two BCP components. (It’s kind of a blending of chemo- and grapho- concepts.) A further refinement leaves the sides of the guides brushed while the bottom surface is rinsed clear.

This is all about “wetting,” a common thread in a number of presentations. It was not unusual for defects to involve a failure of the BCP material to coat all the surfaces properly; you might end up with voids. This becomes difficult to inspect, since it’s a so-called “3D” defect. In the case of a contact hole, for example, the hole might look great at the surface, but might not have cleared properly at the bottom, and this wouldn’t be apparent from a standard inspection. Better wetting helps this dramatically.

Less intuitive is how the BCPs react to such brushing. My assumption would have been that a material with affinity for, say, polystyrene (PS) – one of the components of the most common BCP, linked with PMMA (polymethyl methacrylate) – would cause the PS to position itself alongside that brushed surface, with the PMMA distancing itself from it. But one presentation seemed to indicate the opposite. I talked to the presenter, and he indicated that with a PS-affinity brush, it would actually be the PMMA that would position itself there. Doesn’t quite match my intuitive sense of “affinity,” but then again, this isn’t an area where I trust my intuition.

Also under investigation are different BCPs – in particular, so-called high-χ materials (that’s the Greek letter “chi,” so “high-chi” rhymes in American pronunciation). χ, as far as I can tell, is a measure of the interaction energy between the two polymers in the BCP – how much they dislike touching each other. The higher it is, the more the two materials repel each other. Presumably that makes the self-assembly happen, oh, how to say… with a greater sense of purpose – less wishy-washiness.
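
For what it's worth, the textbook block-copolymer picture (standard Flory-Huggins-style theory, not anything specific from the presentations) says a symmetric diblock only microphase-separates when χN exceeds roughly 10.5, and that in the strong-segregation limit the natural period scales as L0 ~ a·N^(2/3)·χ^(1/6). Put together, a higher χ lets you use a shorter chain and still separate, which is what pushes the minimum achievable period down. A quick sketch of that arithmetic, with purely illustrative numbers:

```python
# Standard block-copolymer scalings (textbook theory, not from any presentation).
# A symmetric diblock microphase-separates only when chi*N exceeds ~10.5; in the
# strong-segregation limit the period goes as L0 ~ a * N**(2/3) * chi**(1/6).
# Near the transition the scaling is only approximate, so treat the numbers as
# order-of-magnitude illustrations.

ODT = 10.5          # order-disorder threshold for a symmetric diblock (chi*N)
A_NM = 0.6          # statistical segment length in nm - an illustrative value

def period(chi, N, a_nm=A_NM):
    """Strong-segregation estimate of the natural period L0, in nm."""
    return a_nm * N ** (2.0 / 3.0) * chi ** (1.0 / 6.0)

def min_period(chi, a_nm=A_NM):
    """Smallest L0 that can still phase-separate: take the shortest chain with
    chi*N just above the order-disorder threshold."""
    return period(chi, ODT / chi, a_nm)

if __name__ == "__main__":
    for chi in (0.04, 0.3):   # roughly PS-PMMA-like vs. a "high-chi" material
        print(f"chi={chi}: minimum period ~ {min_period(chi):.1f} nm")
```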

But it can take a lot of time to experiment with various random combinations of materials. As an alternative, folks are investigating mixing other materials into the more common BCPs that have already been studied. This lets them tune the period, which, in some cases, can be predicted linearly from the weight proportion of the additive. It also allows for thicker layers of the BCP film, which helps manufacturability. Indications were that the process window isn’t compromised by these additives.
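
As a hedged illustration of that linear-tuning idea (the calibration points below are hypothetical, not data from any paper), you could fit a line through a couple of measured (weight fraction, period) pairs and solve for the blend ratio that hits a target period:

```python
# Illustration of "period tunes linearly with additive weight fraction."
# The calibration points are hypothetical: fit a straight line through measured
# (weight fraction, period) pairs, then solve for the blend ratio that hits a
# target period.

def fit_line(points):
    """Ordinary least-squares fit of y = intercept + slope * x."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

if __name__ == "__main__":
    calib = [(0.00, 28.0), (0.10, 30.5), (0.20, 33.1)]   # (wt fraction, L0 nm) - made up
    slope, intercept = fit_line(calib)
    target_nm = 32.0
    wt = (target_nm - intercept) / slope
    print(f"L0(w) ~ {intercept:.1f} + {slope:.1f}*w nm  ->  w ~ {wt:.2f} for {target_nm} nm")
```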

The chemical mixtures also affect the processing time. Looked at simplistically, you lay down a uniform mixture of the BCP material and then bake, or anneal, it. During that bake, the molecules diffuse through each other to separate out. That rate of diffusion determines how long a bake is needed, which determines throughput. One paper had reduced a 30-minute bake to 2 minutes. Once the bake is complete and the temperature lowered, the resulting pattern is “frozen” in place.
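
A simplistic way to see why diffusivity matters, and why that 30-minute-to-2-minute improvement is a big deal: treat ordering as diffusion over roughly one period (t ~ L0²/D) and treat the hot plate as a serial step. The diffusivity values and overhead below are my own illustrative assumptions, not numbers from the paper.

```python
# Simplistic model (my own assumptions, apart from the 30-minute vs. 2-minute
# bake comparison mentioned above): treat ordering as diffusion over roughly
# one period, t ~ L0**2 / D, and treat the hot plate as a serial step to see
# what the bake time does to throughput. Real defect healing may need diffusion
# over several periods, so t is only a lower bound.

def anneal_time_min(L0_nm, D_nm2_per_s):
    """Diffusion-limited ordering time, in minutes."""
    return (L0_nm ** 2 / D_nm2_per_s) / 60.0

def wafers_per_hour(bake_min, overhead_min=1.0):
    """Crude serial throughput through a single bake plate."""
    return 60.0 / (bake_min + overhead_min)

if __name__ == "__main__":
    L0 = 30.0                       # nm, a typical-ish PS-PMMA period
    for D in (0.5, 7.5):            # nm^2/s, hypothetical low vs. high diffusivity
        print(f"D = {D} nm^2/s  ->  bake ~ {anneal_time_min(L0, D):.0f} min")
    for bake in (30.0, 2.0):        # the before/after bake times
        print(f"{bake:>4.0f} min bake  ->  ~{wafers_per_hour(bake):.1f} wafers/hour/plate")
```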

Presumably, during the bake, you’ve got an initial surge of diffusion as the self-assembly proceeds, which would slow down as the process completes. Timing the bake is critical, since if it’s too short, there will still be lots of molecules that might not have found their final resting place. This would likely vary considerably from lot to lot, so the bake has to be long enough to get past that with plenty of margin for repeatability. Playing with the diffusivity of the materials helps to tune – and hopefully minimize – this bake time.

It’s also important to note that much of the work has been done die-by-die. How uniform these processes will be over an entire 300-mm wafer is still an open question, and it’s the focus of further work.

As with other novel lithography techniques, resist and line-edge roughness are important; we’ll talk about those in a future post. In addition, DSA also has some interesting implications for EDA; stay tuned for more on that.

I’d refer you to further materials, but this show works differently in that proceedings aren’t available until weeks after the event. So all I could take away from it were my notes. Which I won’t refer you to since there’s no way you could read my writing.

Mar 05, 2014

Verification Moves to Database

posted by Bryon Moyer

It’s one of those things that I sort of assumed had been done a long time ago: using databases for design information. After all, Magma’s initial claim to fame was the single database for a design, with different tools merely acting as different views into that single database.

Well, it turns out that this only applies to the design itself, along with the tools that let you do the design work. It hasn’t applied to verification.

But now it does: Cadence has recently announced their Incisive vManager tool. It’s a client-server implementation of a process that used to be handled on a file basis. And the reason this wasn’t solved by the whole design database thing of years ago is that this database doesn’t store the design: It stores all of the elements of the verification process itself.

What does this allow? Well, for good or ill, it allows many more ways to access the information or run analysis on the results. Different applications can be layered over it so that a manager can track progress while a verification engineer dives in to figure out where critical failures are.
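
To make that concrete in a generic way – this is a purely hypothetical sketch of the concept, not Cadence's actual schema or API – imagine the verification process captured as a few tables that both a manager's dashboard and an engineer's failure triage can query:

```python
# Purely hypothetical sketch (not Cadence's schema or API): what it means to
# store the verification *process* - regressions, runs, status, coverage,
# failure signatures - in a database that several applications can query.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE regression (id INTEGER PRIMARY KEY, name TEXT, started TEXT);
CREATE TABLE run (
    id INTEGER PRIMARY KEY,
    regression_id INTEGER REFERENCES regression(id),
    testname TEXT, seed INTEGER, status TEXT, coverage_pct REAL, fail_signature TEXT
);
""")
conn.execute("INSERT INTO regression VALUES (1, 'nightly_2014_03_05', '2014-03-05')")
conn.executemany(
    "INSERT INTO run (regression_id, testname, seed, status, coverage_pct, fail_signature)"
    " VALUES (1, ?, ?, ?, ?, ?)",
    [("axi_burst", 11, "pass", 71.2, None),
     ("axi_burst", 12, "fail", 68.9, "response timeout"),
     ("dma_stress", 7, "pass", 64.0, None)])

# A manager's view: progress and failure count per regression.
print(conn.execute(
    "SELECT name, ROUND(AVG(coverage_pct), 1), SUM(status = 'fail') "
    "FROM run JOIN regression ON regression.id = run.regression_id "
    "GROUP BY name").fetchall())

# An engineer's view: which failures to dive into.
print(conn.execute(
    "SELECT testname, seed, fail_signature FROM run WHERE status = 'fail'").fetchall())
```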

The main goal is productivity. And, given the prevalence of databases for absolutely everything these days, you’d think this would be obvious. But it wasn’t obvious years ago, and EDA tools are complex enough that legacy gets passed along for as long as possible, until the pain gets to the point where a major change is needed.

Cadence decided that point is now. You can check out more in their release.
