
Of HIDs and HALs and Hubs

New Pathways and Ambiguous Terms

Those of you in the sensor world are deeply involved with the low-level nuances and intricacies of your devices. How accurate, how linear, how to connect, how to read data, how to fuse data – there’s so much to think about if you put your mind to it.

Of course, the people you’re doing this for – users of phones and tablets and medical devices and industrial sensors – couldn’t care less about that stuff. They want to sleep soundly knowing that, by hook or by crook, those sensors are detecting their assigned phenomena accurately, and the system is correctly reading those data and munging them into whatever form is necessary to provide a simple, meaningful result at the application level.

And in between you and that user lies, among other things, the operating system (OS). OSes are now wise to the ways of sensors, and they’re laying down some rules of the road. We’ve looked at Microsoft’s Human Interface Device (HID) requirements before, but there’s a wrinkle that affects sensors more than other device classes.

Most HID devices interact with the computer via USB. The original thinking was that we would interact with our computers using peripherals – keyboards, mice, and the like – which is largely true. And those peripherals use USB. But sensors are now part of the HID family, and they’re not on USB: they’re typically on I2C or SPI. Those weren’t originally pathways that Windows understood.

One of those pathways has been opened up, however; it’s referred to as “HID over I2C” (it does not cover SPI). In fact, it’s been in place since 2012, although actual support for it is only now starting to trickle out. But, as with everything good, there’s a catch: handshake requirements.

Windows expects to be able to get specific information about the sensor from the sensor itself, and many sensors aren’t set up to interface with the Windows protocol. So how do you deal with that?

It turns out that this is yet another role that sensor hubs can take on. I had a conversation with Microchip regarding their new SSC7102 sensor hub, which provides support for HID over I2C. That spurred the obvious question: what does that mean?

It means that it can be interrogated by Windows per the protocol. Which you might think strange, since it’s not a sensor itself. But it can act as a proxy for the sensors attached to it. Even if those individual sensors don’t support Windows directly, the hub can maintain a table that keeps track of them, along with the required bits and bobs that Windows may request at any time as part of the protocol. When requested, the hub responds on behalf of the sensors, and Windows is none the wiser.
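To make that proxying a bit more concrete, here’s a minimal C sketch of how a hub’s firmware might answer the first steps of the HID-over-I2C handshake on behalf of everything behind it. The register addresses, vendor/product IDs, and descriptor layout below are illustrative assumptions (the real register addresses come via ACPI, and the spec’s HID descriptor carries more fields than shown); this is emphatically not Microchip’s implementation.

    #include <stdint.h>
    #include <string.h>

    /* Illustrative register addresses -- in a real system the host learns
     * the HID descriptor address from ACPI, not from hard-coded values. */
    #define HID_DESC_REG    0x0001u
    #define REPORT_DESC_REG 0x0002u

    /* Simplified HID-over-I2C HID descriptor; the real one in the spec
     * carries more fields (input/output/command/data registers, etc.). */
    typedef struct {
        uint16_t wHIDDescLength;      /* size of this descriptor */
        uint16_t bcdVersion;          /* 0x0100 for spec version 1.0 */
        uint16_t wReportDescLength;   /* size of the report descriptor */
        uint16_t wReportDescRegister; /* register the host reads it from */
        uint16_t wVendorID;
        uint16_t wProductID;
    } hid_desc_t;

    /* One report descriptor covering every sensor the hub fronts for,
     * physical or virtual. Usage page 0x20 is the HID sensor page. */
    static const uint8_t report_desc[] = {
        0x05, 0x20,    /* USAGE_PAGE (Sensor) */
        0x09, 0x73,    /* USAGE (Motion: Accelerometer 3D) */
        /* ...collections for gyro, magnetometer, orientation, etc... */
    };

    static const hid_desc_t hub_desc = {
        .wHIDDescLength      = sizeof(hid_desc_t),
        .bcdVersion          = 0x0100,
        .wReportDescLength   = sizeof(report_desc),
        .wReportDescRegister = REPORT_DESC_REG,
        .wVendorID           = 0x1234,   /* placeholder IDs */
        .wProductID          = 0x5678,
    };

    /* Called from the hub's I2C slave handler when the host reads `reg`.
     * Returns the number of bytes placed in `buf`. */
    static uint16_t on_host_read(uint16_t reg, uint8_t *buf, uint16_t max)
    {
        switch (reg) {
        case HID_DESC_REG:      /* handshake step 1: fetch HID descriptor */
            if (max < sizeof(hub_desc)) return 0;
            memcpy(buf, &hub_desc, sizeof(hub_desc));
            return sizeof(hub_desc);
        case REPORT_DESC_REG:   /* step 2: one answer for all the sensors */
            if (max < sizeof(report_desc)) return 0;
            memcpy(buf, report_desc, sizeof(report_desc));
            return sizeof(report_desc);
        default:
            return 0;
        }
    }

The point is simply that the handshake terminates at the hub: Windows reads descriptors from one I2C address, and whatever lives behind that address is the hub’s business.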

Of course, many of the “sensors” that Windows expects to see don’t actually correspond to any specific “real” sensor. They’re virtual, or “composite,” sensors, created through the magic of further data processing or sensor fusion. So “orientation,” for example, is viewed as a sensor, but, in fact, it’s a figment of the fusing of accelerometer and gyroscope and, probably, magnetometer data. The same thing goes for a compass reading: you might expect that it’s just passing along the magnetometer data, but it also needs to use the accelerometer data to compensate for the tilt of the phone (or whatever the device is), so it’s also a composite reading.
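For the curious, here’s a sketch of the classic tilt-compensation math behind that compass example. It assumes calibrated inputs and a body frame with x forward, y right, z down; real implementations differ in axis and sign conventions and layer calibration and filtering on top.

    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Tilt-compensated compass heading -- the kind of fusion that turns
     * "magnetometer" into "compass". Accelerometer inputs can be in any
     * units proportional to g; magnetometer inputs must be calibrated.
     * Returns heading in degrees: 0 = magnetic north, 90 = east. */
    double compass_heading_deg(double ax, double ay, double az,
                               double mx, double my, double mz)
    {
        /* Roll and pitch from the gravity vector the accelerometer sees. */
        double roll  = atan2(ay, az);
        double pitch = atan2(-ax, sqrt(ay * ay + az * az));

        /* De-rotate the magnetic vector back into the horizontal plane. */
        double xh = mx * cos(pitch)
                  + my * sin(roll) * sin(pitch)
                  + mz * cos(roll) * sin(pitch);
        double yh = my * cos(roll) - mz * sin(roll);

        double heading = atan2(-yh, xh) * 180.0 / M_PI;
        return (heading < 0.0) ? heading + 360.0 : heading;
    }

Hold the device level and the roll and pitch terms drop out, leaving a plain atan2 on the raw magnetometer reading – which is exactly why the “just pass along the magnetometer data” shortcut seems plausible until you tilt the phone.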

One of the main roles of a sensor hub is to perform the calculations that create these composite sensors. So a sensor hub with HID over I2C support can take on the responsibility of answering to Windows both for all of its attached physical sensors and for the virtual sensors that it generates. In Microchip’s case (and possibly others), you don’t need to write the code that makes this happen; they provide the capability out of the box.

Meanwhile, over in Google-land, Android doesn’t seem to have the same kinds of requirements, or at least it manages them differently via its Hardware Abstraction Layer (HAL). But there’s the suggestion of something subtle and surprising.

The suggestion was made by Movea’s Bob Whyte that Android’s KitKat release (4.4) might contain a requirement for hardware sensor hubs. He pointed me to the bit he had noticed. And it comes down to a notion Android has of low-power sensors.

Android requires that certain sensors be able to operate without the application processor (AP) remaining awake. One obvious example is the step counter sensor: if you’re going on an hour-long run, Android doesn’t want the AP to have to be on that entire time. But whenever the AP does come on, it should be able to get an up-to-date, accurate count of steps. So these sensors have to run semi-autonomously. The sensors designated as low power are:

  • Geomagnetic rotation vector
  • “Significant motion”
  • Step detector
  • Step counter

These all have to keep running on their own, storing their results in a FIFO buffer, so that when the AP comes back to them, nothing has been lost. (Although I assume there’s a limit, since buffers must have finite sizes…)
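As a sketch of what that batching might look like in hub firmware (finite-size caveat included), here’s a minimal ring-buffer FIFO for step-detector events. The depth, event layout, and drop-oldest overflow policy are illustrative assumptions, not anything mandated by the Android spec, and real firmware would also guard against races between producer and consumer.

    #include <stdint.h>

    #define FIFO_DEPTH 256u   /* power of two, so wrapping is a cheap mask */

    /* Made-up event layout for the sketch. */
    typedef struct {
        uint64_t timestamp_ns;  /* when the step was detected */
        uint32_t step_count;    /* running total at that instant */
    } step_event_t;

    static step_event_t fifo[FIFO_DEPTH];
    static uint32_t head, tail;   /* free-running; head = write, tail = read */

    /* Producer: called by the hub's detection code while the AP sleeps. */
    void fifo_push(step_event_t ev)
    {
        if (head - tail == FIFO_DEPTH)     /* full: sacrifice the oldest */
            tail++;
        fifo[head & (FIFO_DEPTH - 1u)] = ev;
        head++;
    }

    /* Consumer: drained in one burst when the AP wakes and asks for data. */
    int fifo_pop(step_event_t *ev)
    {
        if (head == tail)
            return 0;                      /* empty */
        *ev = fifo[tail & (FIFO_DEPTH - 1u)];
        tail++;
        return 1;
    }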

All of the low-power sensors are composite sensors. By requiring them to be low-power, the spec is saying that you can’t do the calculations that create the sensor values on the AP; you need to use something external to the AP so that the AP can go to sleep to save power. You might do them in some other core on the same chip as the AP, although many designers are opting for external microcontroller-based hubs because they can be built on less aggressive silicon nodes, making them less leaky and lower in power.

But if you look at the step detector sensor description, for example, you notice an interesting statement: “This sensor must be low power. That is, if the step detection cannot be done in hardware, this sensor should not be defined.”

The sensor needs to be done in… hardware.

That is no small requirement if it means what it looks like it means. Not long ago, we looked at a variety of ways to implement sensor hubs. They involved a mix of hardware and software solutions. While there are hardware solutions out there, the vast majority use software – due largely to the fact that algorithms are still very much in flux and may change even after being deployed in a final system.

So is Google requiring that sensor hubs be implemented in hardware in order to be compliant? That is, does hardware mean “gates” – an ASIC or FPGA? It sure seems to read that way. (At least to a guy who’s spent his career thinking in terms of hardware gates…)

And yet, something about that doesn’t make sense. These things don’t need the raw performance that hardware provides – the timing involved is practically geological compared to what silicon designers are used to dealing with. And, as just noted, most extant hub implementations are software running on low-power microcontrollers.

So… would Google really be telling everyone to chuck their MCU-based hubs and go with FPGAs or ASICs? I mean, are they planning to buy an FPGA company or something?*

There’s another possible interpretation of hardware: the sensor itself. An accelerometer is a hardware sensor; an eCompass isn’t, since it’s a composite fused in software. Could that be what they mean?

No, that doesn’t make sense either, since everything on the list is a composite sensor, which, by definition, isn’t hardware.

I asked around a little, and was surprised to learn that much of what’s expected here is very hush-hush, protected by NDAs. It’s a public spec, but exactly how to interpret it is, apparently, not public. There are invitation-only meetings where these things get discussed, but no one is willing to go on the record with a clear statement of what the words mean. Even people who have had direct conversations with people who should know definitively what the words mean aren’t comfortable saying publicly that they’ve had such conversations.

That said, let’s just say that I have a strong sense that the two hardware interpretations above – the ones that seem questionable – are indeed questionable. It seems exceedingly unlikely that microcontroller hubs will be deprecated. Perhaps whoever wrote that spec thought of “hardware” as “anything that’s not the AP.”

So, assuming that you can meet the expectations of the HAL with respect to how the sensors and hubs treat the AP, do it any way you want. Software, hardware, vaporw – er – OK, not vaporware, but any combination of the others will do quite nicely. I reviewed that conclusion with Mr. Whyte, just to bring the discussion full circle, and he agreed.

Finally, to add some perspective: right before publication, Mr. Whyte noted that Microsoft also seems to use the term “hardware” rather loosely in this regard. At his suggestion, I looked at their “Hardware Certification” program, which includes lots of things that are done in software, and the meaning of “hardware” as including software starts to make a bit more sense.

To them, anything that can be done simply by installing software on a PC (which runs on the main CPU) is “software.” Anything that requires someone to plug in some other device, even if that device contains a processor running software, is “hardware.” Which is consistent with an interpretation of “hardware” as “anything outside the AP, even if implemented in software.”

Reminds me of a customer meeting I attended many years ago (when working for a true hardware company, as it turns out). Things got tense – until we realized that we were using the same words for different things. Kind of like the US and the UK…

*DISCLAIMER: For the record, and for the sake of any legal folks that might have their underwear all twisted up at this moment, no, I’m not saying they are. And if, by mere coincidence, Google is plotting to buy an FPGA company, I have no knowledge thereof (you have to use words like “thereof” for lawyers), and this is not a leak or inside information. It’s mere coincidence. You may stand down now.

More info:

Windows HID over I2C

Android 4.4 HAL composite sensors
