posted by Bryon Moyer
I covered the recent TSensors Summit previously, having attended for one of the three days. That day happened to be dedicated to healthcare, and there were a few interesting points worth noting.
First, I have to say, I was surprised at the number of people who said, “We have the best healthcare system anywhere, and I wouldn’t change it a bit,” followed by a litany of problems with our healthcare system. I don’t know if it was some patriotic thing or an anti-ACA statement or what; it just struck me as incongruous to say that everything is great and then list the things that suck, including facts indicating better outcomes in other countries.
Dr. Mark Zdeblick, of Proteus Digital Health, made an interesting observation: most of today’s electronic healthcare gadgets are for healthy people. These are the things that tell you how many miles you ran or how much of whatever else that only healthy people can do you did. We haven’t actually gotten to the point of improving healthcare yet; we’re mainly maintaining it (for the techno-savvy that can afford it [my editorial, not his]).
The kinds of longer-term items we’re talking about here are patches and ingestibles and such. This is where the daily patch measuring calories in/out to help reduce obesity, if taken by a billion people, gets a third of the way to a trillion yearly sensors. Of course, if it succeeds and we have no more obese people, then that goes against the desire to ship lots of sensors. So we’ll either need a new application or we’ll need to get people to lose some weight, but not enough to be healthy and drop the patch.
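The patch arithmetic above works out as a quick back-of-the-envelope check (round numbers are my own, not from the talk):

```python
# Back-of-the-envelope check on the "trillion sensors" math
# (my numbers, not the speaker's): one disposable patch per person per day.
users = 1_000_000_000           # a billion patch wearers
patches_per_year = users * 365  # one patch daily, each containing a sensor

trillion = 1_000_000_000_000
fraction = patches_per_year / trillion
print(f"{patches_per_year:,} sensors/year = {fraction:.0%} of a trillion")
```

So a billion daily users alone gets you roughly a third of the way to a trillion sensors per year.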
Microfluidic labs-on-chips were also a topic, and in particular, it was noted that there are no good design tools for these. Chips and connectors and MEMS have design tools to help, although chips obviously have the most evolved tools due to their complexity and volume. MEMS and other mechanical devices (like connectors) have tools, but abstraction is further behind there (and may not be needed for the simple things like connectors). No such abstraction exists for microfluidics. Opportunity for an EDA company?
Finally, as noted in the other piece, silicon will not be the answer. Part of that is cost – a big part – but part of it relates to putting things on or in the body. Silicon can be used, but sending something with sharp corners and edges through an artery sounds less than savory, so, when used, it has to be encapsulated in ways that are friendly to the body. Lots of work there for folks doing materials and packaging and connections – particularly wireless connectivity.
One quick afterthought: the only really uncomfortable moment in the day occurred when we had to look at that woman’s colon for far longer than seemed necessary. Um… yeah… Nuff said.
posted by Bryon Moyer
I saw an announcement about a new MEMS diagnostic instrument, the M150 from Ardic. It uses a laser to measure the frequency response of a MEMS element. Sounds simple enough. Or it did until I started thinking more about it. After all, lasers require line of sight. And most MEMS elements are far out of sight. And if you bring them into sight, then you may have changed the environment (air instead of vacuum or controlled gas, microphone packaging and cavities, etc.). So how does that affect the measurement?
I had an interesting chat with founder and CEO Edward Chyau to get some perspective on the tool. First, the main use cases: this is typically used either to verify a design early on, for occasional production monitoring, or for diagnosing failed units. So it’s not a high-volume in-line production system. It’s set up to handle dice, although they have had requests for a unit that could handle 6” wafers. (In theory, on a wafer, neighboring dice could interact since they’re mechanically connected, but he said that the actual element movements are so small that, by the time their vibrations reached the neighboring die, they would have attenuated to a negligible level.)
And yes, the moving element must be visible. An InvenSense gyro, for example, would need to have its capping ASIC removed to take a measurement. So there will be some difference between the element’s behavior under the laser and its behavior in the wild. He says that the tests being run look for significant resonances within the region of interest, indicating that some mechanical element has a resonant mode, due to a design error or a production defect, that can interfere with normal operation. Those modes might shift frequency a little due to depackaging, but not a lot.
So, for example, if a microphone diaphragm exhibited a resonance at 1 MHz, then it wouldn’t be an issue. But if your window was up to 50 kHz and it had a 55 kHz mode, you’d want to pay attention even though it’s nominally outside the window.
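That “nominally outside the window but still worth attention” logic can be sketched as a simple guard-band check (the 10% margin here is my own assumption, not anything Ardic specifies):

```python
# Hypothetical sketch: flag resonant modes that fall inside, or close to,
# a frequency window of interest. The 10% guard band is my assumption.
def classify_mode(mode_hz, window_hz, guard=0.10):
    lo, hi = window_hz
    if lo <= mode_hz <= hi:
        return "in-window"
    if lo * (1 - guard) <= mode_hz <= hi * (1 + guard):
        return "near-window"   # nominally outside, but worth a second look
    return "ignore"

# A 55 kHz mode just above a 50 kHz window still merits attention;
# a 1 MHz mode is safely out of range:
print(classify_mode(55e3, (20.0, 50e3)))
print(classify_mode(1e6, (20.0, 50e3)))
```

The point is just that a mode slightly beyond the window could drift into it once the part is repackaged, so a hard cutoff at the window edge would be too optimistic.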
It might be possible to anticipate any such shifts at design time. If design simulations were done assuming a vacuum, for example, then it would be possible to redo the simulations without a vacuum to establish a baseline expected behavior for laser probing. That baseline could help set expectations for any change in behavior, for correlation purposes, if that were a concern.
I also wondered if the laser itself would heat the element during the measurement, changing the performance as it heated. He said that the laser is low-power, like that on a CD-ROM drive, so no noticeable heating should occur during the 60-second scan.
That scan is part of what differentiates this unit from existing devices. Current technology uses laser Doppler interferometry, and it’s largely targeted at the academic environment with relatively high cost and a long learning curve. And a measurement can take 1-2 hours, with reasonably fussy setup.
By contrast, the Ardic system uses astigmatic detection, also used in their atomic-force microscope (AFM) tools. This gives them particularly good z-axis resolution, and they can scan from 1 Hz to 4.2 MHz in 60 seconds. Users often run that entire range even when they need only a subset, simply because it goes so fast.
I also asked whether this technique could apply down to the NEMS range, where quantum effects start to matter. There, you’re dealing with features smaller than the wavelength of visible light, so getting a laser focused that small and capturing the echo isn’t really feasible.
He said that you would likely need to take a more indirect approach. For example, you might have a MEMS-scale cantilever that acts as an atomic-force probe on the NEMS element. It doesn’t actually make contact (although I suppose that, on an atomic scale, where most of the atom is open space, “contact” is not such a clear concept); it just comes within van der Waals force range. This allows the cantilever to track the motion of the NEMS element; the laser can then be used to measure the cantilever.
But, of course, the cantilever would have its own resonant modes, so you’d have to measure the cantilever first so that you could subtract its behavior out of the combined signal to isolate the NEMS element signal. This hasn’t been done yet, but he saw it as the kind of approach that would be required.
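The “measure the cantilever first, then subtract” idea can be illustrated with toy spectra. This is purely a sketch of my own: real work here would deconvolve complex transfer functions, not subtract magnitudes, and the resonance shapes and frequencies below are invented:

```python
# Illustrative only: subtract a cantilever-only baseline spectrum from a
# combined cantilever+NEMS spectrum to expose the NEMS resonance.
import numpy as np

freqs = np.linspace(1, 4.2e6, 1000)   # the scan range mentioned above

def peak(f, f0, width):
    """Toy Lorentzian-shaped resonance peak (invented numbers)."""
    return 1.0 / (1.0 + ((f - f0) / width) ** 2)

cantilever_only = peak(freqs, 1.0e6, 5e4)                    # baseline scan
combined = cantilever_only + 0.3 * peak(freqs, 2.5e6, 3e4)   # NEMS coupled in

nems_signal = combined - cantilever_only   # crude baseline subtraction
print(f"Recovered NEMS peak near {freqs[nems_signal.argmax()]:.3g} Hz")
```

Even this crude version shows why the baseline scan matters: without it, the cantilever’s own 1 MHz mode would dominate and the smaller NEMS peak could be misattributed.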
You can read more in their announcement.
posted by Bryon Moyer
A frequent topic at events like the Interactive Technology Summit (ITS) is the increasing presence of sensors in phones and other gadgets. But it’s reasonable to ask how many of those sensors will be needed by the majority of people. It’s kind of an 80/20 thing, and the obvious follow-on question is, what to do about the 80% of sensors that 80% of the people don’t need?
One company presenting at the ITS was Variable; they introduced their Node+ sensor platform, which implicitly contains one answer to this partitioning question. They see phones as taking on the following sensors:
- Ambient Light
And, far from being an 80/20 thing, they see these as satisfying 99% of phone users. But there are still “professionals” who will need sensors beyond these, many being very specialized. Rather than burdening the entire phone with them, they would be external, leveraging the phone as a platform for apps that communicate with the external sensors. They listed the following as opportunities for such sensors (with my parenthetical comments):
- Vibration (phones might theoretically be able to do this, but the sensor ranges and quality may not be sufficient)
- Motion tracking (actually, this is already being done in phones, to some degree…)
- Temperature (although there’s probably a good chance many could use this…)
- Air flow
They’ve done yet another partition by separating out what they call the platform from the sensors themselves. The sensor contains the transducer and analog front-end (AFE). The cleaned-up analog signals are delivered to the platform, which, in conjunction with the phone (attached via Bluetooth), handles analog-to-digital conversion, local processing via firmware on a microcontroller, and an application layer (and a possible connection to the cloud). The platform, a small cylinder, can accommodate two sensors – one at each end.
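The sensor/platform/phone partition can be sketched roughly as a three-stage pipeline. All names and numbers here are mine for illustration, not Variable’s API; the ADC resolution and scale are invented:

```python
# Rough sketch of the partitioning: sensor module (transducer + AFE) delivers
# a conditioned analog value; the platform digitizes and processes it; the
# phone layer turns codes back into engineering units.
class SensorModule:
    """Stand-in for a transducer + analog front-end."""
    def read_analog(self) -> float:
        return 23.7    # e.g. a conditioned temperature reading (invented)

class Platform:
    """Stand-in for the cylinder: ADC plus microcontroller firmware."""
    def __init__(self, sensor, bits=12, full_scale=100.0):
        self.sensor, self.levels, self.full_scale = sensor, 2 ** bits, full_scale

    def sample(self) -> int:
        # Analog-to-digital conversion happens on the platform, not the phone
        v = self.sensor.read_analog()
        return round(v / self.full_scale * (self.levels - 1))

def phone_app(platform) -> float:
    # The phone (over Bluetooth in the real product) consumes digital codes
    code = platform.sample()
    return code / (platform.levels - 1) * platform.full_scale

print(f"{phone_app(Platform(SensorModule())):.1f}")
```

The design point the partition captures: the cheap, swappable part of the chain is the analog module at the end of the cylinder, while the ADC, firmware, and radio are amortized across every sensor you buy.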
(I was hoping to include pictures, but I received no response to my request for permission to use their images.)
The sensors themselves are sold separately (with the exception of a 9-axis motion sensor within the platform). Currently available modules include a “luma” module for lighting a scene (it seems more of an accessory than a sensor), a remote “therma” thermometer, a weather/environment “clima” sensor, a color-matching “chroma” sensor, and an “oxa” gas detector capable of sensing CO, NO, NO2, Cl2, SO2, and H2S.
Of course, this kind of partitioning may involve more than a technical breakdown of the 99/1 rule. Other critical factors would likely include simple business model considerations (higher revenue by selling attachments) and control: they don’t control what goes into the phone. If phone makers decide that there is value to some of these (like, for instance, the weather sensor) for the 99%, and if they think they can differentiate their phones with them, then a separate module will look less attractive.
So this partitioning is likely to shift with the vagaries of the market.
You can find more at Variable’s website.