
Simpler MEMS Models for ASIC Designers

Some time back, we took a look at the library of mechanical elements in Coventor’s MEMS+ tool for building MEMS device models. In the “be careful what you wish for” category, making it easier to connect elements into models meant that engineers started connecting more elements into models, and the models got bigger.

Big models can stress a tool out, resulting in slow results and resource starvation.

Well, they’ve just released version 4 of MEMS+, which, first and foremost, addresses those concerns, handling more complex models more quickly.

But there’s a much more subtle way that they’ve addressed the needs of ASIC designers. Each MEMS element will need an accompanying ASIC to clean up the signals and abstract away a lot of the mechanicalness of the element so that electrical types – or, more likely, digital types – can understand the sensor outputs in their own language.

And, of course, you’re going to want to get started on that ASIC design as soon as possible. But the whole purpose of the ASIC is to turn messy sense element behavior into clean outputs, and in order to do that, you need to know exactly what messy signals you’re going to start with. And you don’t want to wait until the device is finished to do that; you want to model the behavior ahead of time.

The thing is, mechanical folks use finite element analysis and other such schemes for simulation; the ASIC designers will be using Verilog-A. MEMS+ is integrated into Cadence’s Virtuoso tool, so Virtuoso users can actually do their modeling using MEMS+ via a proprietary scheme. But Verilog-A can be used anywhere, and not everyone uses Virtuoso.

What that’s meant in the past is that the MEMS designers have had to hand-craft early Verilog-A models for the ASIC guys to get started on. Those models were tedious to create, and in the effort to keep the task manageable, things would get left out of the model. And sometimes those left-out things mattered. Which meant that you wouldn’t find out about them until silicon came out, and you’d have to take another turn at the ASIC.

The next step was that MEMS+ could create a full Verilog-A model automatically. This would include all of the non-linearities and such, but it was a huge model, with thousands of “degrees of freedom” (that is, independent state variables), and it would realistically take far too long to simulate.

So with this release, MEMS+ will let you create a simplified model by selecting specific behaviors and ranges to focus on. These could reflect particular modes or non-linearities of interest. MEMS+ can then fix the other parts, reducing the degrees of freedom from thousands to tens. Which results in a dramatic speedup – like 100X.
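Conceptually, this kind of reduction resembles classic modal truncation: pick the motions you care about and project the big system onto just those. Here’s a minimal sketch (illustrative only – this is a textbook linear example, not Coventor’s actual algorithm) using a mass-spring chain:

```python
# Illustrative modal truncation of a linear mass-spring chain:
# keep only the lowest-frequency modes and project the system
# matrices onto them, shrinking the state the simulator integrates.
import numpy as np

def chain_stiffness(n, k=1.0):
    """Stiffness matrix for n equal masses coupled by springs of
    stiffness k, with both ends anchored."""
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] = 2 * k
        if i > 0:
            K[i, i - 1] = K[i - 1, i] = -k
    return K

def reduce_modes(K, M, n_keep):
    """Solve the generalized eigenproblem K v = w^2 M v, keep the
    n_keep lowest-frequency mode shapes as a basis Phi, and return
    the reduced matrices Kr = Phi^T K Phi and Mr = Phi^T M Phi."""
    w2, V = np.linalg.eigh(np.linalg.solve(M, K))
    idx = np.argsort(w2)[:n_keep]
    Phi = V[:, idx]
    return Phi.T @ K @ Phi, Phi.T @ M @ Phi, Phi

# Reduce a 200-DOF chain to its 10 lowest modes.
K = chain_stiffness(200)
M = np.eye(200)
Kr, Mr, Phi = reduce_modes(K, M, 10)
```

Reducing 200 degrees of freedom to 10 preserves the low-frequency dynamics exactly (for this linear case) while making the matrices the simulator must handle tiny – the same trade the MEMS+ reduced models make at far larger scale, with non-linearities retained where you ask for them.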

This approach can be used on a wide range of sensors – as long as the movement of the element remains a small fraction of the surrounding air gap. There is, however, one behavior that the simplified model doesn’t support; it affects some sensors and applies to essentially all actuators: it’s called “pull-in.”

The idea is that, when you apply an electrostatic field that pulls on a mechanical element, the element will resist thanks to the mechanical restoring force – essentially, it’s a spring pulling back against the field. But at some point, the field overwhelms the restoring force, and the behavior is no longer linear – the element gets “pulled in” to close the gap.
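For the canonical parallel-plate case, that tipping point has a closed form: the plate goes unstable once it has traveled a third of the way across the gap, at a pull-in voltage of V_pi = sqrt(8·k·g³ / (27·ε₀·A)). A quick numeric sketch (the standard textbook 1-DOF model with made-up example values – nothing here comes from MEMS+):

```python
# Textbook parallel-plate pull-in: spring force k*x vs. electrostatic
# force eps0*A*V^2 / (2*(g-x)^2). Below V_pi a stable balance exists;
# above it the field wins everywhere and the plate snaps in.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Analytic pull-in voltage: V_pi = sqrt(8*k*g^3 / (27*eps0*A))."""
    return (8 * k * gap**3 / (27 * EPS0 * area)) ** 0.5

def stable_displacement(V, k, gap, area, iters=100):
    """Bisect for the stable equilibrium on [0, gap/3] (the stable
    branch ends at one-third of the gap). Returns None above pull-in,
    where no equilibrium exists."""
    f = lambda x: k * x - EPS0 * area * V**2 / (2 * (gap - x) ** 2)
    lo, hi = 0.0, gap / 3
    if f(hi) < 0:      # even at the stability limit, the field wins
        return None    # -> pull-in
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Example values: k = 1 N/m, 2-um gap, 100-um-square plate.
vpi = pull_in_voltage(1.0, 2e-6, 1e-8)
```

Below `vpi`, `stable_displacement` finds a balance point; above it, it returns `None` – that discontinuity is exactly what the simplified models leave out.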

I sort of picture it this way (if you’re squeamish, you might skip this bit): picture standing some distance in front of an operating airplane jet engine, facing away from it. With earplugs. Good ones. You feel the pull behind you, but you can lean forward and stand your ground like a boss. Feeling brave, you step back a bit. The pull gets stronger, but you man up and show the universe who’s in charge: you are. Yeah, baby. You repeat this, working harder and harder against the engine’s suction, until suddenly, “whoosh.” Um… yeah. Say no more. Non-linear to say the least.

That discontinuity is pull-in. And it’s not included in these simplified models. It’s probably not a good thing to have in a sensor (although you’d want to know if it’s going to be an issue); it’s actually a useful feature for actuators since it gives you a good, positive contact.

One bit of good news with these simplified models: they run independently of MEMS+. So, unlike the Virtuoso-integrated approach, which requires MEMS+ in the background, you don’t need a MEMS+ license to use the simplified model. Obviously a MEMS guy needs a license to create the model, but the ASIC designer doesn’t need a separate license to run it.

You can find out more about MEMS+ 4 in their release.
