
Motion for User Interfaces

We’ve looked before at ways of controlling machines with just your hands in the air, like you just don’t care. No touchy-feely, no mouse. Just jazz hands.

So at first, when I saw a demo of what we’re going to talk about today, I thought, “OK… this looks kinda like what I was seeing demonstrated a couple of years ago by companies like eyeSight and PointGrab.” And yet it also had a flavor of what I’d seen with Movea and Hillcrest, except that their technologies used remote controls to do what bare hands were doing in this case.

But what I was seeing wasn’t either of those technologies at work. Making it more confusing yet, this isn’t about a particular sensing technique – optical, touch, whatever. And yet it is about motion and location. While the announced technology may be brand new, you would probably have to use it to sense the difference. I was watching the demo over a screen, so I frankly had to ask a lot of questions to figure out why this wasn’t just another gesture-recognition announcement a few years after all the other ones.

I’m talking about Quantum Interface’s new interface called “Qi*.” It’s a way of taking location information and using changes in that location to model motion – and, in particular, to predict where that motion is going and then turn that prediction into information that a user interface can use. The result, they say, is smoother and faster navigation through user interfaces of any kind. Because of the prediction, you don’t have to “complete” motions as much; a little move in a direction will get you where you want to go faster than if you had to, say, track your hand through the whole motion in front of you.
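Quantum Interface hasn’t published how Qi’s prediction actually works, so treat the following as a minimal sketch of the general pattern only – collect location samples, estimate velocity from the changes, and extrapolate ahead of the motion. All of the names here are made up for illustration.

```python
from collections import deque

class MotionPredictor:
    """Illustrative only: estimate velocity from recent location
    samples and extrapolate where the motion is heading."""

    def __init__(self, horizon=0.25, window=5):
        self.horizon = horizon             # seconds to look ahead
        self.samples = deque(maxlen=window)

    def update(self, t, x, y):
        """Feed in a timestamped location sample from any sensor."""
        self.samples.append((t, x, y))

    def predict(self):
        """Return the extrapolated target, or None if we can't yet."""
        if len(self.samples) < 2:
            return None
        (t0, x0, y0) = self.samples[0]
        (t1, x1, y1) = self.samples[-1]
        dt = t1 - t0
        if dt <= 0:
            return None
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        # A small move implies a destination well ahead of where the
        # hand or finger actually is right now.
        return (x1 + vx * self.horizon, y1 + vy * self.horizon)
```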

This use of only location as an input doesn’t involve any gestures. It’s not about specifically identifying a gesture – whether a static hand shape or a motion pattern that a user has to learn. It’s simply about, say, moving your hand or putting a finger on a surface and letting a well-constructed interface make the next movement obvious. Under the hood, the motion is turned into commands; that conversion is specifically the part Qi does do.

It’s often about navigating menus: you move toward a menu, which pops open; you settle on (or towards) an item and push your finger towards the screen, and it takes you to a next-level menu; and so forth. All more quickly and smoothly than with older approaches.
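Again purely as a toy illustration – not Qi’s actual method – that “settling toward an item” could reduce to scoring menu items by how well they line up with the predicted direction of motion, something like this:

```python
import math

def pick_menu_item(origin, predicted, items):
    """Hypothetical helper: score each menu item by how well its
    direction from the pointer's origin aligns with the predicted
    motion, and return the best-aligned item."""
    ox, oy = origin
    mx, my = predicted[0] - ox, predicted[1] - oy
    motion_len = math.hypot(mx, my)
    if motion_len == 0:
        return None                       # no motion, no guess
    best, best_score = None, -1.0
    for name, (ix, iy) in items.items():
        dx, dy = ix - ox, iy - oy
        dist = math.hypot(dx, dy)
        if dist == 0:
            continue
        # Cosine similarity between the motion and the item direction.
        score = (mx * dx + my * dy) / (motion_len * dist)
        if score > best_score:
            best, best_score = name, score
    return best
```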

But here’s another subtle part: this is a mid-layer piece of technology. It lives above hardware – it will take location information from any system that can provide it, whether touch or optical (gesture or eye tracking or…) or whatever. It improves with multiple location sensors providing inputs.
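To picture that layering, here’s a hedged sketch of what a hardware-agnostic location layer could look like. The LocationSource name and the naive averaging are my inventions for illustration; the announcement doesn’t describe Qi’s actual interfaces.

```python
from typing import Protocol

class LocationSource(Protocol):
    """Anything that can report a position: touch, optical tracking,
    eye tracking, and so on."""
    def read(self) -> tuple[float, float]: ...

def fused_location(sources: list[LocationSource]) -> tuple[float, float]:
    """Naive fusion: average the reports from all sensors. A real
    mid-layer would presumably weight by sensor confidence; the point
    is only that this layer sits above the hardware."""
    assert sources, "need at least one location source"
    xs, ys = zip(*(s.read() for s in sources))
    return sum(xs) / len(xs), sum(ys) / len(ys)
```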

It’s also not built into any specific user interface (UI): designers of interfaces can tap the information that Qi provides to drive the interface. Quantum Interface has a fair bit of experience using Qi to build UIs, so they do work with their partners in that realm, but that’s about using Qi; it isn’t Qi itself.

This middleness also makes it system-agnostic: you can create a consistent interface for different app platforms – say, phone, watch, and tablet – and tweak only for the details and resources available on that platform. Somewhat like skinning.

Not sure if I’ve said more about what Qi isn’t than what it is, but both are important since the nuances of what’s new are, well, nuanced. You can find more in their announcement.

*Regrettably, even given China’s large electronics footprint, where they would pronounce that “chee,” and given the wireless power technology Qi, pronounced “chee,” this is not pronounced “chee”: according to the press release, it’s pronounced like its initials, QI (“cue eye”), even though they opted to make the I lower case…

Image courtesy Quantum Interface
