
You Can’t Touch This: New Sensor Edition

Startup UltraSense Uses Ultrasound to Replace Conventional Touch Sensors

“And he that strives to touch the stars oft stumbles at a straw.” – Edmund Spenser

Touch-screen interfaces are cool and sexy and very much de son temps. At first, they were novel and hard to build. Then, chipmakers made it easy by producing lots of different touch solutions. Now, they’re getting hard again. 

The problem is space. And 5G antennas. And metal. And aesthetics. Cellphones are really skinny, covered in metal, full of holes, and filled with RF broadcasters. Trouble is, millimeter wave antennas don’t like to be behind metal, so you put them under glass, which isn’t as bendy as aluminum. And external switches and headphone jacks are so last century, so you plug all the metal holes, too. 

Over in the white goods aisle, refrigerators, ranges, and cooktops are really big and covered in stainless steel, glass, enameled metal, or ceramic, depending on local design tastes. They all have touch interfaces, too, but each material requires a different one. It’s a nuisance, but who ever said being beautiful was easy? 

Turns out, you can solve all these First World Problems by rethinking your touch-interface technology. Instead of using capacitive or piezoelectric sensing, go ultrasound. Ultrasound doesn’t care what material you’re using, how thick it is, how conductive it is, how it flexes, how it’s shaped, or how little space you might have. In fact, ultrasound just don’t care much at all. 

That material agnosticism is very handy when you’re making a variety of different interfaces for different products. It works through any reasonable material of any thickness, and it doesn’t require lengthy calibration during the production process. 

UltraSense is a 20-person startup working on a single-chip commercial ultrasound sensor. Its first-generation TouchPoint chips are already sampling, and the first end-user products to incorporate the sensors are due next summer. Second-generation chips with added functionality will follow shortly thereafter. 

From a designer’s perspective, the sensors couldn’t be easier to use. The tiny, 1.4×2.4 mm package has connections for power and a serial interface. That’s it. You point the top of the chip at the front of your device and wait for data to come back. 

The sensor works by detecting changes in “acoustic impedance.” TouchPoint transmits an ultrasound beam through whatever material(s) it’s mounted to and waits for a reflection. Or, more accurately, for a change in the standard reflection. Usually, that’s an approaching finger. TouchPoint can distinguish between light and heavy finger presses, and it works through leather gloves, rain, dirt, grease, and other contaminants that confuse most other touch technologies. The one thing that does confound it is woolen mittens. At least, for now. 

TouchPoint is tuned to ignore solid materials (metal, glass, wood, etc.) and detect the big impedance mismatch of air. That means it can be used behind a stack of materials (glass over metal, for example) so long as there are no air gaps in the stack. It’s also how TouchPoint distinguishes between firm and gentle finger presses: it senses the amount of air in the user’s fingerprint. A lot of air (relatively speaking) means a light press; less air means a squished finger and a firmer press. Loosely woven textiles – e.g., knitted mittens – contain a lot of air and confuse the sensor. So maybe it’s not the right technology for snowmobiles, but, apart from that, it’s nearly universal. 
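As a rough illustration of the mechanism described above, the sketch below maps the strength of the returned echo, normalized to the untouched baseline, onto no-touch, light-press, and firm-press states. The function and its thresholds are hypothetical placeholders, not UltraSense firmware.

```c
/* Hypothetical sketch of impedance-based press classification.
 * echo_ratio is reflected energy normalized to the untouched baseline
 * (1.0 = free glass-air boundary, nothing touching the surface).
 * Thresholds are illustrative, not UltraSense's actual values. */
typedef enum { NO_TOUCH, LIGHT_PRESS, FIRM_PRESS } touch_state;

touch_state classify_press(double echo_ratio)
{
    if (echo_ratio > 0.85)      /* nearly all energy reflected: air behind */
        return NO_TOUCH;
    if (echo_ratio > 0.50)      /* ridges couple, valleys still trap air */
        return LIGHT_PRESS;
    return FIRM_PRESS;          /* air squeezed out, energy absorbed */
}
```

The key design point is that the sensor never measures force directly; it infers it from how much air remains trapped under the fingertip.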

The sensor chip itself contains UltraSense’s proprietary MEMS transducer, an unnamed MCU, nonvolatile memory, voltage regulators, and assorted analog circuitry. The company supplies firmware tailored to different materials and GUI interactions. For example, it can distinguish between single, double, and multiple taps, tap-and-hold, and slider actions.
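One plausible way to separate those gestures is purely by timing: how long the finger stays down and how quickly a second press follows. The sketch below is a made-up illustration of that logic with arbitrary thresholds, not a description of UltraSense’s firmware.

```c
/* Hypothetical gesture discrimination by timing alone.
 * press_ms: how long the touch lasted, in milliseconds.
 * gap_ms:   delay before a second press, or -1 if none followed.
 * Thresholds are illustrative placeholders. */
typedef enum { SINGLE_TAP, DOUBLE_TAP, TAP_AND_HOLD } gesture;

gesture classify_gesture(int press_ms, int gap_ms)
{
    if (press_ms > 400)                 /* finger lingered: hold */
        return TAP_AND_HOLD;
    if (gap_ms >= 0 && gap_ms < 300)    /* quick second press: double tap */
        return DOUBLE_TAP;
    return SINGLE_TAP;
}
```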

What TouchPoint can’t do is replace a large trackpad. Its ultrasound transducer has a very narrow field of view, which is great for closely spaced virtual buttons but not for tracking large-scale motions like tracing words or sketching patterns. It takes multiple TouchPoint chips in a row to create a virtual slider. Making a virtual trackpad would require a grid of TouchPoint chips, which probably wouldn’t be cost effective. On the other hand, if you’re building a UI into a wooden table or behind chromed metal, you don’t have a lot of other choices. It makes sense. 
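For the virtual-slider case, one plausible (purely illustrative) approach is to treat the row of sensors as a crude one-dimensional array and take the intensity-weighted centroid of their readings. None of this comes from UltraSense; it is just a sketch of how a host MCU might fuse several narrow-beam sensors into one continuous position.

```c
/* Hypothetical slider-position estimate from a row of n touch sensors
 * spaced one unit apart: the intensity-weighted centroid of readings.
 * Returns a position in [0, n-1], or -1.0 if nothing is pressed. */
double slider_position(const double *intensity, int n)
{
    double weighted = 0.0, total = 0.0;
    for (int i = 0; i < n; i++) {
        weighted += intensity[i] * (double)i;
        total    += intensity[i];
    }
    return (total > 0.0) ? weighted / total : -1.0;
}
```

A finger straddling two adjacent sensors lands between their indices, which is why a handful of chips in a row can resolve finer positions than the chip count alone suggests.
<imports>
#include <assert.h>
</imports>
<test>
int main(void)
{
    double touch[4] = { 0.0, 1.0, 1.0, 0.0 };
    double idle[4]  = { 0.0, 0.0, 0.0, 0.0 };
    assert(slider_position(touch, 4) == 1.5);
    assert(slider_position(idle, 4) == -1.0);
    return 0;
}
```
</test>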
