
DSA Update

The focus of the directed self-assembly (DSA) discussion at SPIE Advanced Litho has changed. In past years, it felt more like the efforts were largely about corralling this interesting new wild thing, or even seeing whether it could be corralled at all.

Well, this year it felt more like it’s in the corral, but there’s lots more training to do to make it a well-behaved showhorse. The focus is now on manufacturability. What are the tweaks and changes needed to turn this into a reliable, predictable process?

We’ve covered the basics of DSA before, but part of its distinctive character is in the sensitivity of the self-assembly process to subtle effects. Figuring out what matters and what doesn’t – and how it might be made more robust – is part of what’s going on now.

The biggest topic is defectivity. The desired pattern is consistent rows of lines or dots; the opposite of that is the fingerprint look of a randomly ordered pattern. The defects you see now are such that you mostly have, say, parallel lines, but occasionally one line meanders over to another, or shows some other hint of the latent fingerprint. Because these defects are different from those you might be used to with more traditional lithography techniques, work is still needed to characterize and measure the specific defectivity modes.

You may recall that there are two versions of DSA: chemo-epitaxy and grapho-epitaxy. The former embeds a guide pattern underneath the spot where the block copolymers (BCPs) will go (kind of like damascene-style); that pattern has a chemical affinity for one of the two polymer blocks, thereby guiding the final pattern. The latter sets up guides in the same plane as the BCP film, making them a physical rather than a chemical guide. (I saw these guiding lines, typically simply called “guides,” referred to as “weirs” in one presentation.)

One trend I noticed was that several presenters saw grapho-epitaxy being preferred for contact holes (often abbreviated simply as C/H), while chemo-epitaxy would be preferred for lines and spaces (L/S). One possible reason for that is that the grapho-epitaxy guides take up space: no lines can go where they are, whereas no such space is lost with chemo-epitaxy.
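To put rough numbers on that space argument, here’s a quick back-of-the-envelope sketch in Python. The period, guide pitch, and guide width below are made up for illustration; they’re not from any of the presentations.

```python
# Rough comparison of usable line positions per guide pitch for
# grapho- vs. chemo-epitaxy L/S. All numbers are illustrative only.

L0 = 14.0            # natural BCP period (nm), hypothetical
guide_pitch = 84.0   # distance between guide features (nm), hypothetical
guide_width = 14.0   # width of a grapho guide "weir" (nm), hypothetical

# Chemo-epitaxy: the chemical guide lies underneath the film,
# so the full pitch is available for lines.
chemo_lines = guide_pitch / L0

# Grapho-epitaxy: the guide sits in the same plane as the BCP film,
# so no lines can form where the guide itself is.
grapho_lines = (guide_pitch - guide_width) / L0

print(f"chemo-epitaxy:  ~{chemo_lines:.0f} lines per guide pitch")
print(f"grapho-epitaxy: ~{grapho_lines:.0f} lines per guide pitch")
```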

There was, however, one presentation where grapho-epitaxy was used for L/S; they over-etched the guides to make them the same width as the final intended lines. In that way, rather than getting in the way of the lines, the guides themselves actually became lines.

Another trend with grapho-epitaxy is to “brush” various surfaces to further bias the self-assembly. This brushing involves a light coating of a material with an affinity for one of the two BCP components. (It’s kind of a blending of chemo- and grapho- concepts.) A further refinement leaves the sidewalls of the guides brushed while the bottom surface is rinsed clear.

This is all about “wetting,” a common thread in a number of presentations. It was not unusual for defects to involve a failure of the BCP material to coat all the surfaces properly; you might end up with voids. This becomes difficult to inspect, since it’s a so-called “3D” defect. In the case of a contact hole, for example, the hole might look great at the surface, but might not have cleared properly at the bottom, and this wouldn’t be apparent from a standard inspection. Better wetting helps this dramatically.

Less intuitive is how the BCPs react to such brushing. My assumption would have been that a material with affinity for, say, polystyrene (PS) – one of the components of the most common BCP, linked with PMMA (polymethyl methacrylate) – would cause the PS to position itself alongside that brushed surface, with the PMMA distancing itself from it. But one presentation seemed to indicate the opposite. I talked to the presenter, and he indicated that with a PS-affinity brush, it would actually be the PMMA that would position itself there. Doesn’t quite match my intuitive sense of “affinity,” but then again, this isn’t an area where I trust my intuition.

Also under investigation are different BCPs – in particular, so-called high-χ materials (that’s the Greek letter chi, pronounced so that “high-χ” rhymes in American English). χ, as far as I can tell, is a measure of the energetic incompatibility between the two blocks of the copolymer. The higher that incompatibility, the more the two materials repel each other. Presumably that makes the self-assembly happen, oh, how to say… with a greater sense of purpose – less wishy-washiness.
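For the curious, the usual textbook picture (this is background I’m adding, not something from the talks) is that what matters is the product of χ and the chain length N: roughly speaking, a symmetric diblock only microphase-separates once χN gets above about 10.5, and the natural period scales something like N^(2/3)·χ^(1/6), which is why a high-χ material can reach a small period with a shorter chain. A toy sketch, with made-up numbers:

```python
# Textbook rules of thumb (not from the SPIE papers): ordering needs roughly
# chi * N > 10.5 for a symmetric diblock, and in the strong-segregation limit
# the natural period scales as L0 ~ a * N**(2/3) * chi**(1/6).
# All numbers below are illustrative, not measured values.

def orders(chi, N, threshold=10.5):
    """Crude check of whether a symmetric diblock should microphase-separate."""
    return chi * N > threshold

def period_nm(chi, N, a=0.6):
    """Strong-segregation scaling for the natural period
    (a = statistical segment length in nm, hypothetical)."""
    return a * N ** (2.0 / 3.0) * chi ** (1.0 / 6.0)

for chi, N in [(0.04, 350), (0.04, 700), (0.2, 200)]:
    print(f"chi={chi}, N={N}: orders={orders(chi, N)}, "
          f"L0 ~ {period_nm(chi, N):.1f} nm")
```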

But it can take a lot of time to experiment with various new combinations of materials. As an alternative, folks are investigating the mixing of other materials into the more common BCPs that have already been studied. This lets them tune the period, which, in some cases, scales linearly with the weight fraction of the additive. It also allows for thicker BCP films, which helps manufacturability. Indications were that the process window isn’t compromised by these additives.
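Here’s what that linear tuning knob might look like in a toy model. The neat-BCP period and the slope are hypothetical placeholders; the point is just that, over some range, the period moves linearly with how much additive you blend in.

```python
# Sketch of the "tune the period with an additive" idea: assume (as some
# papers suggested for certain systems) that the period moves linearly with
# the weight fraction of the blended-in material. The intercept and slope
# here are made-up numbers purely for illustration.

L0_neat = 28.0   # period of the unblended BCP (nm), hypothetical
slope = -20.0    # change in period per unit weight fraction (nm), hypothetical

def blended_period(weight_fraction):
    """Linear period model; valid only over the range where the linear
    approximation holds for the particular blend."""
    return L0_neat + slope * weight_fraction

for w in (0.0, 0.05, 0.10, 0.20):
    print(f"additive weight fraction {w:.2f}: period ~ {blended_period(w):.1f} nm")
```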

The chemical mixtures also affect the processing time. Looked at simplistically, you lay down a uniform mixture of the BCP material and then bake, or anneal, it. During that bake, the molecules diffuse through each other to separate out. That rate of diffusion determines how long a bake is needed, which determines throughput. One paper had reduced a 30-minute bake to 2 minutes. Once the bake is complete and the temperature lowered, the resulting pattern is “frozen” in place.
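One simplistic way to think about that trade-off (my framing, not anything from the papers): if the anneal is limited by chains diffusing a distance on the order of one period, the characteristic time goes as L²/D, so cutting a 30-minute bake to 2 minutes implies roughly a 15x improvement in effective diffusivity. A toy sketch with made-up numbers:

```python
# Crude anneal-time/diffusivity estimate: treat the bake as chains diffusing
# a distance on the order of the BCP period, so t ~ L**2 / D.
# All values are illustrative only.

L = 15e-9                    # diffusion distance (m), roughly one period; hypothetical
D_slow = L**2 / (30 * 60)    # diffusivity implied by a 30-minute bake (m^2/s)
D_fast = L**2 / (2 * 60)     # diffusivity implied by a 2-minute bake (m^2/s)

def anneal_time_minutes(L_m, D_m2_per_s):
    """Characteristic time for a chain to diffuse a distance L, in minutes."""
    return (L_m ** 2 / D_m2_per_s) / 60.0

print(f"implied diffusivity ratio: {D_fast / D_slow:.0f}x")
print(f"time at D_slow: {anneal_time_minutes(L, D_slow):.0f} min, "
      f"at D_fast: {anneal_time_minutes(L, D_fast):.0f} min")
```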

Presumably, during the bake, you’ve got an initial surge of diffusion as the self-assembly proceeds, which would slow down as the process completes. Timing the bake is critical, since if it’s too short, there will still be lots of molecules that might not have found their final resting place. This would likely vary considerably from lot to lot, so the bake has to be long enough to get past that with plenty of margin for repeatability. Playing with the diffusivity of the materials helps to tune – and hopefully minimize – this bake time.

It’s also important to note that much of the work has been done die-by-die. How uniform these processes will be over an entire 300-mm wafer is still an open question, and it’s the focus of further work.

As with other novel lithography techniques, resist and line-edge roughness are important; we’ll talk about those in a future post. In addition, DSA also has some interesting implications for EDA; stay tuned for more on that.

I’d refer you to further materials, but this show works differently in that proceedings aren’t available until weeks after the event. So all I could take away from it were my notes. Which I won’t refer you to since there’s no way you could read my writing.
