
Motion for User Interfaces

We’ve looked before at ways of controlling machines with just your hands in the air, like you just don’t care. No touchy-feely, no mouse. Just jazz hands.

So at first, when I saw a demo of what we’re going to talk about today, I thought, “OK… this looks kinda like what I was seeing demonstrated a couple years ago by companies like eyeSight and PointGrab.” And yet it also had a flavor of what I’d seen with Movea and Hillcrest, except that those technologies relied on remote controls to do what bare hands were doing in this case.

But what I was seeing wasn’t either of those technologies at work. Making it more confusing yet, this isn’t about a particular sensing technique – optical, touch, whatever. And yet it is about motion and location. While the announced technology may be brand new, you would probably have to use it to sense the difference. I was watching the demo on a screen, so I frankly had to ask a lot of questions to figure out why this wasn’t just another gesture-recognition announcement arriving a few years after all the other ones.

I’m talking about Quantum Interface’s new interface called “Qi*.” It’s a way of taking location information and using changes in that location to model motion – and, in particular, to predict where that motion is going and then turn that prediction into information a user interface can use. The result, they say, is smoother and faster navigation through user interfaces of any kind. Because of the prediction, you don’t have to “complete” motions as much; a small move in a given direction gets you where you want to go faster than if you had to, say, track your hand all the way across the space in front of you.
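To make that a bit more concrete, here is a rough sketch of the general idea: keep a short history of location samples, estimate a velocity, and extrapolate to a likely destination. To be clear, this is just an illustration with invented names and numbers, not Qi's actual algorithm.

    from collections import deque

    class MotionPredictor:
        """Turns a stream of (time, x, y) location samples into a predicted target point."""

        def __init__(self, history=5, lookahead_s=0.25):
            self.samples = deque(maxlen=history)   # recent (t, x, y) samples
            self.lookahead_s = lookahead_s         # how far ahead to extrapolate

        def add_sample(self, t, x, y):
            self.samples.append((t, x, y))

        def predict(self):
            # With fewer than two samples there is no motion to extrapolate.
            if len(self.samples) < 2:
                return None
            (t0, x0, y0) = self.samples[0]
            (t1, x1, y1) = self.samples[-1]
            dt = t1 - t0
            if dt <= 0:
                return None
            vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # average velocity over the window
            # A small early move already implies a destination further along this line.
            return (x1 + vx * self.lookahead_s, y1 + vy * self.lookahead_s)

The payoff of the prediction step is that the interface can begin reacting to where the motion is headed rather than waiting for the hand to arrive.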

This location-only notion of input doesn’t involve any gestures. It’s not about specifically identifying a gesture – whether a static hand shape or a motion pattern that a user has to learn. It’s simply about, say, moving your hand or putting a finger on a surface and letting a well-constructed interface make the next movement obvious. Under the hood, that motion is turned into commands; this, specifically, is the part Qi does.

It’s often about navigating menus: you move toward a menu, which pops open; then you settle on (or toward) an item and push your finger toward the screen, which takes you to the next-level menu, and so forth. All more quickly and smoothly than with older approaches.
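To picture how a menu layer might consume those predictions (again, invented names and a deliberately naive structure, not Quantum Interface's code), all it really needs to know is which item the motion is converging on and whether a "push" has happened:

    import math

    def nearest_item(predicted_xy, items):
        # items: list of (label, (x, y)) menu entries; return the label closest to the prediction.
        px, py = predicted_xy
        return min(items, key=lambda it: math.hypot(it[1][0] - px, it[1][1] - py))[0]

    def step_menu(menu_tree, path, predicted_xy, pushed):
        # menu_tree entries look like {label: (position, submenu_dict)}.
        level = menu_tree
        for label in path:
            level = level[label][1]            # walk down to the currently open submenu
        items = [(label, pos) for label, (pos, _) in level.items()]
        target = nearest_item(predicted_xy, items)
        if pushed:                             # e.g. the finger moved toward the screen
            path = path + [target]             # open the next-level menu
        return target, path

Fed a fresh prediction every frame, the highlighted item tracks intent, and a push simply commits to whatever the motion was already converging on.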

But here’s another subtle part: this is a mid-layer piece of technology. It lives above hardware – it will take location information from any system that can provide it, whether touch or optical (gesture or eye tracking or…) or whatever. It improves with multiple location sensors providing inputs.
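One way to picture that hardware independence: anything that can report a location in a shared coordinate space plugs into the same predictor, and several sources can be combined. The interface and the naive averaging below are my own sketch, not Qi's API.

    from typing import Optional, Protocol, Tuple

    class LocationSource(Protocol):
        def latest(self) -> Optional[Tuple[float, float, float]]:
            """Return the most recent (t, x, y) reading, or None if nothing is available."""

    def fuse(sources):
        # Naive fusion: average whatever readings are currently available.
        readings = [r for r in (s.latest() for s in sources) if r is not None]
        if not readings:
            return None
        t = max(r[0] for r in readings)
        x = sum(r[1] for r in readings) / len(readings)
        y = sum(r[2] for r in readings) / len(readings)
        return (t, x, y)   # this is what would be handed to something like MotionPredictor.add_sample

A touch panel, an optical tracker, and an eye tracker would all look the same from the predictor's point of view.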

It’s also not built into any specific user interface (UI): designers of interfaces can tap the information that Qi provides to drive the interface. Quantum Interface has a fair bit of experience using Qi to build UIs, so they do work with their partners in that realm, but that’s about using Qi; it isn’t Qi itself.

This middleness also makes it system-agnostic: you can create a consistent interface for different app platforms – say, phone, watch, and tablet – and tweak only for the details and resources available on that platform. Somewhat like skinning.

Not sure if I’ve said more about what Qi isn’t than what it is, but both are important since the nuances of what’s new are, well, nuanced. You can find more in their announcement.


*Regrettably, even given China’s large electronics footprint, where they would pronounce that “chee,” and given the wireless power technology Qi, pronounced “chee,” this is not pronounced “chee”: according to the press release, it’s pronounced like its initials, QI (“cue eye”), even though they opted to make the I lower case…


Image courtesy Quantum Interface
