posted by Bryon Moyer
Much of the early work on technology is, of course, done in universities. And schools are increasingly collaborating to be more effective. Cornell and Stanford, in particular, are effecting a bicoastal partnership (OK, to those on the East Coast, Ithaca is probably far enough west – being on the foreign side of the Hudson at a latitude where the coast proper has migrated even further east – to qualify as also being on the West Coast, but bear with me.)
They gave a bit of a presentation recently showing what they’re working on. It’s always interesting to see what’s cooking in these kinds of projects. And some interesting work is underway, although a bit more time was devoted to the political side of things than I found useful (you know, the “Check us out, see how well we’re working together, and thanks to all our funders, and please keep sending money” thing, although it’s not quite worded that way. Sorry… my cynical side is showing a bit…)
So the glimpses of actual work were brief, but there are some cool things happening:
- They’re working on a nano-particle-based photoresist for EUV use. The particles are 2-3 nm in diameter and are based on metal oxides: an organic shell that provides “photochemical cross-linking” encases an inorganic core (a metal oxide such as HfO2 or ZrO2) that resists the etch. They’re able to create nice sharp 20-nm lines at this point. (Ober Group)
- They’re working on better polymers for organic electronics. One example shows the incorporation of fluorine to create polymers that aren’t damaged by the organic solvents frequently used in manufacturing. This makes them immersible and easier to work with. (Ober Group)
- They’re working on the integration of LEDs deep into the chip. (Cornell Nanophotonics Group)
- In a cross between photonics and MEMS (or NEMS), they’re looking at micro-mechanical elements that can be moved by shining photons rather than by some other physical actuator. They call this “optomechanics.” (Cornell Nanophotonics Group)
- “Transformation optics” is the cryptic name given for such things as “cloaking” – that is, making stuff invisible. This involves nano-particles, and is an area of pursuit here. (Cornell Nanophotonics Group)
- They’re also working on fabrics that incorporate nano-materials, primarily by coating the native fibers with a material that has some desired property. Part of it is to be resistant to bacteria (which makes sense for hospital clothing, but, put on normal streetwear, would continue, with unabated hubris, man’s valiant attempts to vanquish all bacteria – even as we discover more about their benefits); fabrics with electronic properties are, of course, also in the works. (Textiles Nanotechnology Laboratory)
- There was also discussion of a “smart bandage” that could communicate remotely, monitor a wound, and administer medicine without the need to rip it up. Imagine someday someone using the by-then idiomatic expression “ripped open that wound” and wondering, “Whatever does that mean anyway? Grampa, did they used to do that in the olden days?”
Of course, before any of this makes it into the real world, they’re also trying to suss out the environmental implications of the particles they’re creating. These materials don’t exist in nature, so living organisms presumably have no mechanisms for dealing with them – which could be good or bad. Which is why we need to look into it.
posted by Bryon Moyer
In the wake of the UCIS announcement at DAC (which we’ll cover separately later), I sat down with some of Mentor’s functional verification folks to get an update. Coverage was one of the items on their agenda as part of addressing metric-driven verification.
They talk in terms of changing the engineering mindset when it comes to evaluating verification tools. Right now engineers tend to think in terms of “cycles/second”: how fast can you blaze through these vectors? Mentor is trying to change that thought process to “coverage/cycle”: it’s OK to take longer per cycle (OK, they didn’t explicitly say that – probably a bit dodgy territory from a marketing standpoint – and I don’t know whether their solution is any slower on a per-cycle basis – I’m inferring here) as long as you get coverage faster. In other words, maybe one tool can zip through a bazillion vectors in three hours, but it’s better to have a tool that needs only half a bazillion vectors and completes in two hours (slower on a per-vector basis, but faster to overall completion).
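Putting toy numbers on that comparison (the figures here are mine, not Mentor’s), a quick sketch shows why overall completion time, not per-vector speed, is the metric that matters:

```python
# Toy numbers (not Mentor's): both tools reach the same coverage goal.
tool_a = {"vectors": 1.0e9, "hours": 3.0}  # fast per vector, needs many vectors
tool_b = {"vectors": 0.5e9, "hours": 2.0}  # slower per vector, needs fewer

def vectors_per_hour(tool):
    """The 'cycles/second' mindset: raw vector throughput."""
    return tool["vectors"] / tool["hours"]

def coverage_per_vector(tool, coverage_goal=1.0):
    """The 'coverage/cycle' mindset: how much of the coverage goal
    each vector buys you."""
    return coverage_goal / tool["vectors"]

# Tool A wins on raw throughput...
assert vectors_per_hour(tool_a) > vectors_per_hour(tool_b)
# ...but Tool B gets twice the coverage out of each vector and finishes
# a full hour sooner.
assert coverage_per_vector(tool_b) > coverage_per_vector(tool_a)
assert tool_b["hours"] < tool_a["hours"]
```

The per-vector “loser” is the overall winner, which is the whole point of the mindset shift.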
Part of this is handled by their InFact “intelligent testbench.” They try to solve two problems with it, as I see it. First, there are hard-to-reach states in any design; the tool builds a graph of the design for use in identifying trajectories. From that, they should be able to reach any reachable state with the fewest vectors possible. Which is fine when testing just that one state.
But the second thing they do is what would appear to be their own variation of the “traveling salesman” problem: how do you traverse the graph to hit all the nodes with as little repetition as possible? (The canonical traveling salesman problem is about visiting every node exactly once, ending back where you started, and minimizing the total distance traveled.) The idea is to get full coverage with as few vectors as possible. This gets directly at the “coverage/cycle” metric.
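The two ideas can be sketched with a toy state graph. This is my own illustration, not Mentor’s actual InFact algorithm: a breadth-first search finds the fewest “vectors” (graph edges) to reach one hard-to-reach state, and a greedy tour chains such searches together to cover every state:

```python
from collections import deque

def shortest_stimulus(graph, start, target):
    """BFS over a state graph: the fewest input 'vectors' (edges) needed
    to drive the design from `start` to `target`."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, path = frontier.popleft()
        if state == target:
            return path
        for vector, nxt in graph.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [vector]))
    return None  # target unreachable

def coverage_tour(graph, start):
    """Greedy heuristic: repeatedly walk to the cheapest-to-reach uncovered
    state, accumulating one long stimulus sequence that visits them all."""
    uncovered = set(graph) - {start}
    here, tour = start, []
    while uncovered:
        target, path = min(
            ((s, shortest_stimulus(graph, here, s)) for s in sorted(uncovered)),
            key=lambda sp: len(sp[1]) if sp[1] is not None else float("inf"),
        )
        if path is None:
            break  # remaining states unreachable from here
        tour += path
        uncovered.discard(target)
        here = target
    return tour

# Hypothetical 4-state FSM: edges are (input_vector, next_state).
fsm = {
    "idle":  [("req", "busy"), ("cfg", "setup")],
    "setup": [("go", "busy")],
    "busy":  [("err", "fault"), ("done", "idle")],
    "fault": [("rst", "idle")],
}
print(shortest_stimulus(fsm, "idle", "fault"))  # ['req', 'err']
print(coverage_tour(fsm, "idle"))               # ['req', 'err', 'rst', 'cfg']
```

Note that the greedy tour is not optimal (here, the sequence cfg, go, err would cover everything in three vectors instead of four) – finding the truly minimal covering sequence is where the traveling-salesman flavor of the problem comes in.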
Which reinforces the old truth that simply having and rewarding metrics doesn’t necessarily help things. It’s too easy to have the wrong metrics – which will be attained and for which rewards will be paid – and not improve life. Because they’re the wrong metrics.
Perhaps MDV should be modified to UMDV: Useful-Metric-Driven Verification. Of course, then we’ll get to watch as companies battle over which metrics are useful. But that could make for entertaining viewing too…
posted by Bryon Moyer
The use of photons as signal carriers has historically gone towards long-distance transport, either over the air (feels like waves more than photons) or within fiber. But the distances of interest have dropped dramatically, to the point where there are discussions of using silicon photonics even for on-chip signaling.
In a conversation at Semicon West with imec’s Ludo Deferm, we discussed their current work. At this point, and for at least 10 years out, he doesn’t see CMOS and photonics co-existing on the same wafer. The bottleneck right now isn’t on-chip; it’s chip-to-chip – 40-60 Gb/s internally is fine for now. Which suggests putting the photonics on a separate chip – in a 3D-IC stack or on an interposer, for example – dedicated to routing signals between the chips in the stack.
That photonic chip would be made with the same equipment as a CMOS chip – a specific goal of the imec work in commercializing silicon photonics – but it starts with a different wafer: SOI, with a thinner silicon layer than a typical CMOS wafer would have, and with that thickness (or thinness) tightly controlled to reduce optical losses.
You can read more about imec’s progress in their recent announcement.