
Motion for User Interfaces

We’ve looked before at ways of controlling machines with just your hands in the air, like you just don’t care. No touchy-feely, no mouse. Just jazz hands.

So at first, when I saw a demo of what we’re going to talk about today, I thought, “OK… this looks kinda like what I was seeing demonstrated a couple of years ago by companies like eyeSight and PointGrab.” And yet it also had a flavor of what I’d seen from Movea and Hillcrest, except that those technologies relied on remote controls to do what bare hands were doing in this case.

But what I was seeing wasn’t either of those technologies at work. Making it more confusing yet, this isn’t about a particular sensing technique – optical, touch, whatever. And yet it is about motion and location. The announced technology may be brand new, but you’d probably have to use it to sense the difference. I was watching the demo on a screen, so I frankly had to ask a lot of questions to figure out why this wasn’t just another gesture-recognition announcement arriving a few years after all the other ones.

I’m talking about Quantum Interface’s new interface called “Qi*.” It’s a way of taking location information and using changes in that location to model motion – and, in particular, to predict where that motion is going and then turn that prediction into information that a user interface can use. The result, they say, is smoother, faster navigation through user interfaces of any kind. Because of the prediction, you don’t have to “complete” motions as much; a small move in some direction will get you where you want to go faster than if you had to, say, track your hand all the way across the space in front of you.
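To make the prediction idea concrete, here’s a minimal sketch of the general technique as I understand it: estimate velocity from successive location samples, project it forward, and guess which target the motion is headed toward. The names and the math are my own illustration, not Quantum Interface’s implementation.

```typescript
// Hypothetical sketch (not Qi's actual algorithm): estimate direction of
// motion from recent location samples and project it forward, so a small
// move toward a target can select it before the motion is completed.

interface Point { x: number; y: number; t: number }  // position plus timestamp (ms)

function predictPosition(samples: Point[], horizonMs: number): Point | null {
  if (samples.length < 2) return null;
  const a = samples[samples.length - 2];
  const b = samples[samples.length - 1];
  const dt = b.t - a.t;
  if (dt <= 0) return null;
  // Velocity from the last two samples; a real system would smooth over more.
  const vx = (b.x - a.x) / dt;
  const vy = (b.y - a.y) / dt;
  // Project the motion forward by the prediction horizon.
  return { x: b.x + vx * horizonMs, y: b.y + vy * horizonMs, t: b.t + horizonMs };
}

// Pick whichever UI target the projected point lands closest to.
function likelyTarget(p: Point, targets: { id: string; x: number; y: number }[]) {
  return targets.reduce((best, tgt) =>
    (tgt.x - p.x) ** 2 + (tgt.y - p.y) ** 2 <
    (best.x - p.x) ** 2 + (best.y - p.y) ** 2 ? tgt : best);
}
```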

This notion of using only location as an input involves no gestures. It’s not about specifically identifying a gesture – whether a static hand shape or a motion pattern that a user has to learn. It’s simply about, say, moving your hand or putting a finger on a surface and letting a well-constructed interface make the next movement obvious. Under the hood, the motion is turned into commands; that’s specifically the part that Qi does do.

It’s often about navigating menus: you move toward a menu, which pops open; you settle on (or toward) an item and push your finger toward the screen, and it takes you to a next-level menu, and so forth – all more quickly and smoothly than with older approaches.
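As a rough illustration of that flow – again, my own sketch, not Quantum Interface’s API – a predicted target plus a “push” toward the screen could drive menu traversal something like this:

```typescript
// Hypothetical menu traversal driven by a predicted target and a "push"
// event (e.g., the finger moving toward the screen). Invented for illustration.

interface MenuItem { id: string; x: number; y: number; children?: MenuItem[] }

class MenuNavigator {
  constructor(private current: MenuItem[]) {}

  // Called continuously with the item the motion is predicted to reach.
  highlight(predictedId: string): MenuItem | undefined {
    return this.current.find((item) => item.id === predictedId);
  }

  // Called when a push toward the screen is detected on the highlighted item.
  push(item: MenuItem): void {
    if (item.children && item.children.length > 0) {
      this.current = item.children;        // descend into the next-level menu
    } else {
      console.log(`activate ${item.id}`);  // leaf item: fire its action
    }
  }
}
```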

But here’s another subtle part: this is a mid-layer piece of technology. It lives above hardware – it will take location information from any system that can provide it, whether touch or optical (gesture or eye tracking or…) or whatever. It improves with multiple location sensors providing inputs.
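In code terms, you could imagine – and this is purely my own sketch of the “mid-layer” idea, not their API – an abstraction where any sensor that can report a location feeds the same stream, and the motion/prediction layer subscribes without caring what produced the samples:

```typescript
// Hypothetical sensor-agnostic location layer: touch, optical gesture
// tracking, eye tracking, etc. all adapt to the same interface, and the
// motion layer above merges whatever sources are attached.

interface LocationSample { x: number; y: number; t: number; sourceId: string }

interface LocationSource {
  readonly id: string;
  // Returns an unsubscribe function.
  subscribe(onSample: (s: LocationSample) => void): () => void;
}

class MotionLayer {
  private samples: LocationSample[] = [];

  attach(source: LocationSource): () => void {
    // The layer doesn't care whether samples come from a touchscreen or a
    // camera; more sources simply mean more data to model motion from.
    return source.subscribe((s) => this.samples.push(s));
  }
}
```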

It’s also not built into any specific user interface (UI): designers of interfaces can tap the information that Qi provides to drive the interface. Quantum Interface has a fair bit of experience using Qi to build UIs, so they do work with their partners in that realm, but that’s about using Qi; it isn’t Qi itself.

This middleness also makes it system-agnostic: you can create a consistent interface for different app platforms – say, phone, watch, and tablet – and tweak only for the details and resources available on that platform. Somewhat like skinning.

Not sure if I’ve said more about what Qi isn’t than what it is, but both are important since the nuances of what’s new are, well, nuanced. You can find more in their announcement.

 

 

*Regrettably, even given China’s large electronics footprint, where they would pronounce that “chee,” and given the wireless power technology Qi, pronounced “chee,” this is not pronounced “chee”: according to the press release, it’s pronounced like its initials, QI (“cue eye”), even though they opted to make the I lower case…

 

Image courtesy Quantum Interface

