
Sensors Get Platforms

Reference Kits Abstract the Details

New technology follows an arc. If it’s something really new and different, then it typically starts with an inspiration or an insight into how to do something really hard. The focus of all effort is then on doing whatever hard thing makes the New Technology possible.

Users of the New Technology tend to be early adopters – folks that can take a novel data sheet, figure out what it all means, and design whatever is necessary to integrate the New Thing into a system.

For sensors, you might imagine the original ur-sensors (many, many years ago) as providing only analog data. And the sensor maker is heads-down perfecting how that data is acquired, how accurate it is, and how stable and resilient the sensor is. Everything else – converting the data to digital, formatting it, communicating it – all of that stuff has to be done by the system designer.

After this goes on for a while, a couple of things might happen. If not enough demand materializes, the New Technology might wither and die, becoming yet another Silicon Valley “Remember When?” tale told in one of the few remaining old bars that haven’t been mowed down to make room for condos.

On the other hand, things might pick up, with competition sprouting all around, and now it’s no longer enough just to make your sensor and spew some data. You have to do it better than someone else. And one of the ways to keep ahead and to avoid the dying of the technology is to make it easier to use. Bring the digital conversion into the sensor. Put it on a popular bus. Give it interrupts.

This makes the technology accessible to a much broader range of customers, including those that lack the time or inclination to experiment or to learn from scratch how to implement some New Thing.

That then becomes the new normal, and functionality starts to settle out and even succumbs to standardization.

At which point optimization starts to kick in as the added value. How fast does it run? How much or how little power is required? How big is it? How much integration is there? There’s no end to the variety of configurations that might be proposed during this phase.

Sensors are in the thick of this now, with competing combinations of sensors and hubs. Smartphones have garnered most of the attention, but wearables and other consumer gadgetry are starting to steal some of those brain cycles, and the optimizations for one may not apply to the other.

So we’ve gone from system designers starting with an analog value and having to do everything else themselves to being able simply to read a digital sensor value. But now there are endless choices over how to configure the sensor subsystem. And the proliferation of sensors has meant that having a sensor is now no big thing. Everyone has them. And everyone wants to use them.

Which means lots of potential customers that don’t care a whit about how sensors work or interconnect or anything. All they care about is that they fit, that they do what they’re told, and – critically – that they go easy on battery life.

This is the stage where platforms emerge: configurable subsystems where most of the work has been done and, as the system designer, you provide the high-level settings that apply to your particular system and you’re good to go.

We seem to have arrived at this stage with sensors. As CES was approaching (which I skipped, being the wimp that I am), suddenly a raft of platforms found their way into my inbox. It surprised me: it wasn’t like there was one, then a long delay, and another, and more months, and then a few more. No – four of them in less than a month, all targeting the pre-CES hail of press releases.

There are, of course, two goals to such platforms. The altruistic one is that sensor technology is made accessible to a much broader range of developers, and, in the aggregate, less engineering time is spent by dozens or hundreds of engineers all repeating the same work with minor variations.

The practical goal is that, by providing a platform, you rope a designer into using your sensor or other enabling technology.

As a developer, having a platform like that can dramatically simplify life – as long as you want to stay within the parameters of the platform. Looking for control over the details? Then a platform may or may not help. If the platform is relatively open, then you can use it as a reference design – a starting point from which you can then deviate, swapping components and designing a custom board. If you’re still using the provider’s sensor, then they may support you in your efforts. If not, then… you’re on your own (or you’ll need to get support from whomever you switched to).

So here’s what came over the transom in December and early January, in chronological order of my receiving them.

First off, Quintic. They’ve got a platform that they’re targeting at the wearable market in particular. Now… while you might think of a platform as a PC board with off-the-shelf devices pre-chosen and pre-configured for you, think again. This is a bigger commitment than that: it’s silicon. A chip called the QN9020. It contains the following blocks:

  • Cortex M0 (32 MHz)
  • Memory (volatile and non-)
  • Bluetooth Smart (BT-LE being an enabling technology); 2.4 GHz, -95 dBm receiver sensitivity
  • Housekeeping stuff: DC-DC; ADC; comparator; timer; oscillator/clock; GPIOs; busses (UART, I2C, SPI)

You’ll notice one major missing item: sensors. You attach your sensors. They have software to fuse them, although some people use third-party software. Primary support so far is for ST and InvenSense, per customer request.

A system built from this consists of the 9020, the sensor(s), and a battery. Some folks use a chip antenna. Power is 25 mW peak, which they say is best in class (with the closest competitor being around 35 mW). The peak radio current, which is the thing that can kill the battery, is 8 mA.

Packaging options include tiny: CSP and bare die. This is an important consideration for wearables, since the circuitry may need to be exceedingly small, even sewn into clothing. Flexible cable may be needed to run through fabric.

Looking forward, their next round will take power down to 10 mW, which is low enough to start integrating with a solar harvester. Beyond that, with increased efficiency, they want to be able to work with piezos in shoes, for example. But that’s, like, 3 years out. Organic approaches to flexible circuits and such are even further out – they’re not really on the Quintic radar.

Next, Movea and TI announced a smart wristwatch platform. It combines TI’s Bluetooth LE radio with Movea’s sensor fusion algorithms for:

  • Detecting the wearer’s posture: standing, sitting, walking, or running
  • Monitoring and classifying the wearer’s activity: counting steps, calories, and miles/kilometers
  • Over-optimizing – er – optimizing sports activity: how fast and efficiently you’re running or biking
  • Monitoring sleep: timing and deconstructing sleep cycles

They claim to have achieved a >95% success rate when classifying the above activities; they say that their step-count algorithm had the fewest errors when compared to other commercial solutions; and they say that their sleep analysis compared favorably with the medical gold standard.

This is more of a reference design kit, and it includes an API, Movea’s fusion algorithms, and TI’s BT-LE technology in a wristwatch design done by Xm-Squared.

Next was PNI Sensor. They’re leveraging their SENtral motion coprocessor, combining it with various sensors to offer complete motion modules (they call them Sentral M&M for “Motion and Measurement”).

This is a pure work-saving play. Their idea is to abstract away all of the nuances of working directly with sensors (things like calibration, drift, dealing with noise and anomalies, etc.). They have six different modules based on the same basic specs with different combinations of sensors. And they’re color-coded:

  • White has no sensors (it’s just the SENtral chip);
  • Red combines an InvenSense six-axis motion combo and an AKM Hall-effect sensor;
  • Green is like Red, but it replaces the InvenSense sensor with an STMicroelectronics one;
  • Orange is similar, but with a Bosch Sensortec six-axis sensor instead of the InvenSense or ST;
  • Blue is like Green, but with the AKM swapped out for a PNI magneto-inductive sensor; and
  • Yellow has an ST 9-axis inertial module.

Finally, MEMSIC announced a watch platform created through collaboration with Meta Watch. The sensor focus here is the integration of MEMSIC’s low-power magnetometer, endowing the watch with e-compass functionality. There’s not much information available on this yet, so we’ll have to wait for details.

It’s entirely to be expected that more of these will show up. Which means that designers won’t have to be confused over which sensors to use; they’ll merely be confused by which reference design or kit to use.

But it’s all part of the natural process that takes low-level technology and abstracts away the tough bits (unfortunately, the stuff that the engineers behind it might be proud of) and refocuses on what system designers need in order to complete their designs as quickly as possible.

Of course, as this becomes the rule, sensor and software makers end up selling less to the designer and more to the potential kit partner. That’s not so much the case yet, as exemplified by PNI in particular, where the purchaser starts with specific sensors in mind and picks the appropriate (color) kit. The more designers get distanced from specific sensors, however, the more the sensors will be sold through the kits in certain markets. That makes for kingmakers and business deals. And we all know that those often trump technology when it comes to business success.

So change is afoot. Expect to see more of these here and elsewhere…


More info where available:

Meta Watch

Movea (microsite under development; I’ll update link when I get it)

PNI Sensor Sentral M&M


