
QuickLogic Goes Wearable

We’ve looked at QuickLogic’s sensor hub solution in quite some detail in the past. It’s programmable logic at its heart, but it’s sold as a function-specific part (as contrasted with Lattice, which sells a general-purpose low-power part into similar applications). QuickLogic recently announced a wearables offering, which got me wondering how different it is from their prior sensor hub solution.

After all, it’s really kind of the same thing, only for a very specific application: gadgets that are intended to be worn. Such gadgets are battery-powered and require the utmost in power miserliness to be successful.

You may recall that QuickLogic’s approach is an engine implemented in their programmable fabric. They’ve then put together both a library of pre-written algorithms and a C-like language that allows implementation of custom algorithms; in both cases, the algorithms run on that engine. So the question here is, did the engine change for the wearable market, or is it just a change in the algorithms?
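QuickLogic hasn’t published the syntax of that language in this announcement, but since they describe it as C-like, a custom algorithm might look roughly like the sketch below. To be clear, everything here – the callback convention, the names, the thresholds – is invented for illustration; this is not QuickLogic’s actual API, just a flavor of what "an algorithm running on the engine" means.

/* Hypothetical sketch of a custom step-detection algorithm written
 * in a C-like sensor-hub language. All names and thresholds are
 * invented for illustration; this is not QuickLogic's real API. */

#include <stdint.h>
#include <stdio.h>

#define STEP_THRESHOLD  1200   /* accel magnitude, arbitrary units */
#define MIN_STEP_GAP_MS 250    /* debounce window between steps    */

static uint32_t step_count   = 0;
static uint32_t last_step_ms = 0;

/* Imagined hook: the engine calls this for each accelerometer sample. */
void on_accel_sample(int32_t x, int32_t y, int32_t z, uint32_t now_ms)
{
    /* Cheap magnitude proxy: no sqrt on a tiny low-power engine. */
    int32_t mag = (x < 0 ? -x : x) + (y < 0 ? -y : y) + (z < 0 ? -z : z);

    if (mag > STEP_THRESHOLD && (now_ms - last_step_ms) > MIN_STEP_GAP_MS) {
        step_count++;
        last_step_ms = now_ms;
    }
}

int main(void)
{
    on_accel_sample(1500, 0, 0, 300);  /* spike -> counted as a step */
    on_accel_sample(1500, 0, 0, 400);  /* too soon -> debounced      */
    on_accel_sample(1500, 0, 0, 700);  /* counted as a second step   */
    printf("steps: %u\n", (unsigned)step_count);
    return 0;
}

The point of the architecture is that logic like this runs on the hub engine, not on the application processor, so the big processor can stay asleep.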

[Figure: QuickLogic sensor hub architecture (image courtesy QuickLogic)]

I checked in, and they confirmed that the engine has not changed – it’s the same as for the general sensor hub. What they have done is focus the libraries on context and gesture algorithms most applicable to the wearables market.

Some time back, we looked at how different sensor fusion guys approach the problem of figuring out where your phone is on you. A similar situation exists for wearables, both in classifying what the wearer is doing and in determining the gadget’s relationship to the wearer. QuickLogic’s approach supports six different states (or contexts): walking, running, cycling, in-vehicle, on-person, and not-on-person.

They’ve also added two wearable-specific gestures for waking the device up either by tapping it or by rotating the wrist.
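To make that event model concrete, here’s a minimal sketch of how a host application might consume these six contexts and two wake gestures. Again, the enum values and callback are hypothetical stand-ins; QuickLogic’s actual interface will differ.

/* Hypothetical host-side handling of the contexts and wake gestures
 * described above. Names are invented for illustration only. */

#include <stdio.h>

typedef enum {
    CTX_WALKING, CTX_RUNNING, CTX_CYCLING,
    CTX_IN_VEHICLE, CTX_ON_PERSON, CTX_NOT_ON_PERSON
} context_t;

typedef enum { GESTURE_TAP_WAKE, GESTURE_WRIST_ROTATE_WAKE } gesture_t;

/* Imagined callback: the hub interrupts the host only on events,
 * so the application processor can sleep the rest of the time. */
void on_hub_event(int is_gesture, int code)
{
    if (is_gesture) {
        printf("wake: %s\n",
               code == GESTURE_TAP_WAKE ? "tap" : "wrist rotation");
    } else {
        static const char *names[] = {
            "walking", "running", "cycling",
            "in-vehicle", "on-person", "not-on-person"
        };
        printf("context: %s\n", names[code]);
    }
}

int main(void)
{
    on_hub_event(0, CTX_WALKING);
    on_hub_event(1, GESTURE_WRIST_ROTATE_WAKE);
    return 0;
}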

Critically, they do this with under 250 µW when active.

You can read more in their announcement.
