posted by Bryon Moyer
The focus of the directed self-assembly (DSA) discussion at SPIE Advanced Litho has changed. In past years, it felt more like the efforts were largely about corralling this interesting new wild thing, or even seeing if it could be corralled.
Well, this year it felt more like it’s in the corral, but there’s lots more training to do to make it a well-behaved showhorse. The focus is now on manufacturability. What are the tweaks and changes needed to turn this into a reliable, predictable process?
We’ve covered the basics of DSA before, but part of its distinctive character is in the sensitivity of the self-assembly process to subtle effects. Figuring out what matters and what doesn’t – and how it might be made more robust – is part of what’s going on now.
The biggest topic is defectivity. The desired pattern is consistent rows or dots; the opposite is the fingerprint-like look of a randomly ordered pattern. The defects you see now fall in between: mostly parallel lines, say, but with the occasional line meandering over to its neighbor, or some such hint of the latent fingerprint. Because these defects differ from those you might be used to with more traditional lithography techniques, work is still needed to characterize and measure the specific defectivity modes.
You may recall that there are two versions of DSA: chemo-epitaxy and grapho-epitaxy. The former embeds a guide pattern underneath (kind of like damascene-style) where the block copolymers (BCPs) will go; that pattern has a chemical affinity for one of the two polymers, thereby guiding the final pattern. The latter sets up guides in the same plane as the BCP film; it becomes a physical rather than chemical guide. (I saw these guiding lines, typically simply called “guides,” referred to as “weirs” in one presentation).
One trend I noticed was that several presenters saw grapho-epitaxy being preferred for contacts and holes (often abbreviated simply as C/H), while chemo-epitaxy would be preferred for lines and spaces (L/S). One possible reason for that would be that the grapho-epitaxy guides take up space; no lines can go where they are, whereas no such space is lost with chemo-epitaxy.
There was, however, one presentation where grapho-epitaxy was used for L/S, and they over-etched the guides to make them the same width as the final intended lines. In that way, rather than getting in the way of the lines, they actually became lines.
Another trend with grapho-epitaxy is to “brush” various surfaces to further bias the self-assembly. This brushing involves a light coating of a material with an affinity for one of the two blocks of the BCP. (It’s kind of a blending of chemo- and grapho- concepts.) One further refinement leaves the sides of the guides brushed while the bottom surface is rinsed clear.
This is all about “wetting,” a common thread in a number of presentations. It was not unusual for defects to involve a failure of the BCP material to coat all the surfaces properly; you might end up with voids. This becomes difficult to inspect, since it’s a so-called “3D” defect. In the case of a contact hole, for example, the hole might look great at the surface, but might not have cleared properly at the bottom, and this wouldn’t be apparent from a standard inspection. Better wetting helps this dramatically.
Less intuitive is how the BCPs react to such brushing. My assumption would have been that a material with affinity for, say, polystyrene (PS) – one of the components of the most common BCP, linked with PMMA (polymethyl methacrylate) – would cause the PS to position itself alongside that brushed surface, with the PMMA distancing itself from it. But one presentation seemed to indicate the opposite. I talked to the presenter, and he indicated that with a PS-affinity brush, it would actually be the PMMA that would position itself there. Doesn’t quite match my intuitive sense of “affinity,” but then again, this isn’t an area where I trust my intuition.
Also under investigation are different BCPs – in particular, so-called high-Χ materials (that’s the Greek “chi,” pronounced so that “high-Χ” rhymes in American English). Χ, as far as I can tell, is a measure of the energy difference between the two blocks of the copolymer. The higher that difference, the more the two blocks repel each other. Presumably that makes the self-assembly happen, oh, how to say… with a greater sense of purpose – less wishy-washiness.
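For what it’s worth, standard block-copolymer theory (this is textbook material, not anything claimed in the presentations) makes this quantitative. For a symmetric diblock with degree of polymerization N, mean-field theory says microphase separation kicks in once ΧN exceeds roughly 10.5, and in the strongly segregated limit the natural period L0 grows only weakly with Χ:

```latex
% Order-disorder criterion for a symmetric diblock (mean-field theory):
\chi N \gtrsim 10.5

% Strong-segregation scaling of the natural period
% (a = statistical segment length, N = degree of polymerization):
L_0 \sim a\, N^{2/3}\, \chi^{1/6}
```

So a higher-Χ material both segregates more decisively and, because you can stay above the ordering threshold with a shorter chain, opens the door to smaller pitches.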
But it can take a lot of time to experiment with various random combinations of materials. As an alternative, folks are investigating the mixing of other materials into the more common BCPs that have already been studied. This lets them tune the period, which, in some cases, can be predicted linearly with the weight proportion of the additive. It also allows for thicker layers of the BCP film, which helps the manufacturability. Indications were that the process window isn’t compromised by these additives.
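If the period really does track the additive loading linearly, tuning a blend reduces to fitting a line through a couple of measurements. A minimal sketch of that idea follows – every number in it is hypothetical, purely for illustration, and not taken from any paper at the show:

```python
# Illustrative only: a linear blend model for tuning a BCP's natural period.
# All numbers below are hypothetical placeholders, not measured values.

def blended_period(w_additive, l0_neat=28.0, slope=-20.0):
    """Predict the natural period L0 (nm) of a BCP blend.

    w_additive: weight fraction of the additive (0.0 to 1.0)
    l0_neat:    period of the neat (unblended) BCP, in nm
    slope:      change in period (nm) per unit weight fraction,
                fit from a couple of measured blend points
    """
    return l0_neat + slope * w_additive

# Neat material: 28 nm period (by construction of the model).
print(blended_period(0.0))   # 28.0
# A 10% additive loading shifts the period downward.
print(blended_period(0.10))
```

The point isn’t the arithmetic, of course – it’s that a linear response means two calibration runs characterize the whole blend range, which is what makes this attractive versus synthesizing new BCPs from scratch.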
The chemical mixtures also affect the processing time. Looked at simplistically, you lay down a uniform mixture of the BCP material and then bake, or anneal, it. During that bake, the molecules diffuse through each other to separate out. That rate of diffusion determines how long a bake is needed, which determines throughput. One paper had reduced a 30-minute bake to 2 minutes. Once the bake is complete and the temperature lowered, the resulting pattern is “frozen” in place.
Presumably, during the bake, you’ve got an initial surge of diffusion as the self-assembly proceeds, which would slow down as the process completes. Timing the bake is critical, since if it’s too short, there will still be lots of molecules that might not have found their final resting place. This would likely vary considerably from lot to lot, so the bake has to be long enough to get past that with plenty of margin for repeatability. Playing with the diffusivity of the materials helps to tune – and hopefully minimize – this bake time.
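That margin logic can be sketched in a few lines. The model below is a toy – it assumes defects anneal out exponentially with a time constant set by the BCP’s diffusivity, and every number in it is made up for illustration:

```python
import math

# Illustrative only: a toy model of defect anneal-out during the DSA bake.
# All parameter values are hypothetical, chosen purely for illustration.

def defects_per_cm2(t_minutes, d0=1e4, tau=3.0, floor=0.1):
    """Defect density after t minutes of bake.

    d0:    initial (as-coated) defect density
    tau:   decay time constant in minutes -- shorter for higher diffusivity
    floor: residual defectivity that longer baking alone won't remove
    """
    return floor + (d0 - floor) * math.exp(-t_minutes / tau)

def min_bake(target, margin=2.0, **kwargs):
    """Shortest whole-minute bake meeting the target, padded by a
    safety margin for lot-to-lot variation."""
    t = 0
    while defects_per_cm2(t, **kwargs) > target:
        t += 1
    return t * margin

# Halving tau (i.e., doubling the effective diffusivity) roughly
# halves the required bake -- which is the whole point of tuning
# the materials' diffusivity.
print(min_bake(1.0))           # baseline material
print(min_bake(1.0, tau=1.5))  # faster-diffusing material
```

The exponential form also shows why the tail matters: the last stragglers dominate the required bake time, so a modest diffusivity improvement buys a disproportionate throughput gain.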
It’s also important to note that much of the work has been done die-by-die. How uniform these processes will be over an entire 300-mm wafer is still an open question, and it’s the focus of further work.
As with other novel lithography techniques, resist and line-edge roughness are important; we’ll talk about those in a future post. In addition, DSA also has some interesting implications for EDA; stay tuned for more on that.
I’d refer you to further materials, but this show works differently in that proceedings aren’t available until weeks after the event. So all I could take away from it were my notes. Which I won’t refer you to since there’s no way you could read my writing.
posted by Bryon Moyer
It’s one of those things that I sort of assumed had been done a long time ago: using databases for design information. After all, Magma’s initial claim to fame was the single database for a design, with different tools merely acting as different views into that single database.
Well, it turns out that this only applies to the design itself, along with the tools that allow you to do design. It hasn’t applied to verification.
But now it does: Cadence has recently announced their Incisive vManager tool. It’s a client-server implementation of a process that used to be handled on a file basis. And the reason this wasn’t solved by the whole design database thing of years ago is that this database doesn’t store the design: It stores all of the elements of the verification process itself.
What does this allow? Well, for good or ill, it allows many more ways to access the information or run analysis on the results. Different applications can be layered over it so that a manager can track progress while a verification engineer dives in to figure out where critical failures are.
The main goal is productivity. And, given the prevalence of databases for absolutely everything these days, you’d think this would be obvious. But it wasn’t obvious years ago, and EDA tools are complex enough that legacy gets passed down as long as possible, until the pain reaches the point where a major change is needed.
Cadence decided that point is now. You can check out more in their release.
posted by Bryon Moyer
Not long ago I noted the sudden appearance of various reference designs and platforms and kits intended to take some of the friction out of the process of adopting sensors, especially for the non-sensor-savvy.
Well, it wasn’t an isolated phenomenon: they keep coming.
Since then, I’ve noted the following:
- This one isn’t strictly a sensor kit, but it fits into the whole IoT picture: NFC. ST announced a “discovery kit” that “…contains everything engineers need to start adding NFC connectivity to any kind of electrical device…” It contains the tag, microcontroller, antenna, screen, joystick, and connectors. A premium version includes Bluetooth with audio out and a headset.
- InvenSense announced a wearable platform, which contains “…all of the key functions of a health and fitness wearable device…” Those would comprise motion and pressure sensors, microcontroller, Bluetooth Low-Energy, and their Automatic Activity Recognition software, which provides “always on” functionality.
- Movea announced a sensor hub kit for mobile devices. It’s a “…complete software and hardware package on a Nexus phone…” running Android 4.4 (KitKat). Quoting from their announcement, it includes the following functions (with power indicated on ST’s STM32F401 microcontroller):
- Significant motion detection (<40 mW)
- Step counting (<100 mW)
- Activity monitoring and context awareness (<300 mW)
- Cadency, speed and distance when walking and running
- Energy expenditure
- Context detection for walking, running and in transport
- Extensive library supporting a wide range of sports at >95 percent
- Pedestrian Dead Reckoning (<1.8 mW)
- Step cadency, distance, heading, floor detection
And I assume these won’t be the last… I’ll update occasionally as these fly over the transom.