
A Dev Board with Both Touch and Gestures

How do people want to interact with their machines? Some of us are most productive with mouse and keyboard (although I keep seeing presentations complaining that we’ve had them for too long – as if we need to get rid of them even if they’re the best tool for some jobs).

Touch has obviously taken over a large number of systems, and for many things, it’s simpler and more obvious (never mind that trying to make it be everything for everyone doesn’t work…). Most importantly, it’s the latest rage, so… well, if it’s cool and popular, then it must be good, right?

Well, look out, touch: the next thing for our ADD world is gesture technology. It’s like touch without the touch. In fact, you can have a “gesture” that’s your pointing finger doing spooky-touch-and-click-at-a-distance, but there’s also the whole gesture vocabulary thing, where different gestures mean different specific things (we’ll dig more into that in the future). And, to hear many folks say it, this is the NEXT next big thing.

Given this history of one mode replacing another mode and then being replaced itself, it’s easy to think of these things as competing. Why would you actually touch a system if you could gesture? Why would you waste money and space on a keyboard if you could touch a virtual keyboard? Well, because different jobs work best with different modalities. They don’t necessarily have to compete.

So it was interesting to see Microchip release a new dev board featuring a touchpad. It’s not a touch dev board; it’s not a gesture dev board: it has both touch and gesture recognition technology built into the touchpad. So you can develop systems that use both approaches – giving developers the opportunity to provide their customers with the best tool for the job.

They call the input device a “3D Touchpad.” In most cases, adding a third dimension to a touchpad means tracking how hard a finger presses. But that’s not what it means here: in this context, it’s the air gestures above the touchpad that account for the third dimension.

The gesture element leverages Microchip’s GestIC technology, which generates a weak electric field and decodes gestures from the distortions your hand introduces into that field. Electrodes embedded in the touchpad, along with Microchip’s GestIC controller chip, add this capability to the otherwise-2D touchpad. Note that they also support “surface gestures” – gestures swiped on the touchpad itself.

This isn’t a system per se; it appears to be targeted at developing human-machine interface (HMI) approaches and drivers that would then be integrated into different systems that use the 3D touchpad. The dev kit comes with a free downloadable GUI and an SDK/API package.
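To give a flavor of what a dual-modality HMI driver might look like, here’s a minimal sketch in Python. The event types, gesture names, and dispatch logic are hypothetical illustrations – they are not taken from Microchip’s actual SDK/API, which would deliver events from the GestIC controller through its own interface. The point is simply that touch and air gestures can feed one event loop and map to different actions:

```python
from dataclasses import dataclass

# Hypothetical event types -- NOT Microchip's actual API.
@dataclass
class TouchEvent:
    x: float   # contact position on the 2D pad
    y: float

@dataclass
class AirGestureEvent:
    name: str  # e.g. "swipe_left", "circle_cw"

def handle_event(event):
    """Dispatch touch and air-gesture events to different actions."""
    if isinstance(event, TouchEvent):
        # Touch: fine-grained pointing, like a conventional touchpad.
        return f"cursor -> ({event.x:.1f}, {event.y:.1f})"
    if isinstance(event, AirGestureEvent):
        # Air gesture: a small vocabulary mapped to discrete commands.
        actions = {
            "swipe_left": "previous page",
            "swipe_right": "next page",
            "circle_cw": "volume up",
        }
        return actions.get(event.name, "ignored")
    raise TypeError("unknown event type")

print(handle_event(TouchEvent(10.0, 20.5)))        # cursor -> (10.0, 20.5)
print(handle_event(AirGestureEvent("circle_cw")))  # volume up
```

The design choice the board invites is exactly this split: continuous pointing goes to touch, while coarse, at-a-distance commands go to the gesture vocabulary – each modality doing the job it’s best at.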

You can find more detail in their announcement.
