
Sensors in Windows

Microsoft Lays Down the Law

They were late to the party, but they seem to be deciding the menu.

Phones and tablets, pioneered by Apple and Google, have dramatically accelerated the use of sensors. Windows 8 has only just been released, and yet suddenly Microsoft seems to be calling the shots when it comes to sensor integration in consumer devices and computers.

Why? Well, they are a pretty big elephant, and, even if some might perceive that to make them slow and lumbering (or – gasp! – moving towards irrelevance), they still have a fair bit of leverage in deciding where they get to sit. And, clearly, although the other guys have led the way, Microsoft seems to be saying, “Um, we’ll take over laying out the road forward from here, mkay?” And sensor vendors are listening, making it clear with their new releases that they meet the requirements of the Windows 8 Human Interface Device (HID) spec. So Microsoft is being taken very seriously.

Originally intended to handle such input peripherals as mice and keyboards, the current HID spec abstracts the concept of what constitutes a “human”: all sensors, whether or not they reflect input from a human, are managed under the HID umbrella.

At its most basic, HID is an API. As with iOS or Android, anyone wanting to make their sensors available on a Windows 8 system must target that API so that application software can access the sensor data.
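
As a rough sketch of what that looks like from the application side, the snippet below uses the desktop Sensor API (the COM layer that sits above the sensor drivers) to find a 3-axis accelerometer and pull a single reading. Error handling is abbreviated, and which sensors actually show up will, of course, depend on the machine.

```cpp
// Minimal sketch: pull one accelerometer sample via the desktop Sensor API (COM).
// Error handling is abbreviated; a real app would also handle permission prompts.
#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>
#include <atlbase.h>
#include <cstdio>

#pragma comment(lib, "sensorsapi.lib")
#pragma comment(lib, "portabledeviceguids.lib")

int main()
{
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);
    {
        CComPtr<ISensorManager> manager;
        if (FAILED(manager.CoCreateInstance(CLSID_SensorManager))) return 1;

        // Ask the OS for any 3-axis accelerometers it knows about.
        CComPtr<ISensorCollection> sensors;
        ULONG count = 0;
        if (FAILED(manager->GetSensorsByType(SENSOR_TYPE_ACCELEROMETER_3D, &sensors)) ||
            FAILED(sensors->GetCount(&count)) || count == 0)
            return 1;

        CComPtr<ISensor> accel;
        sensors->GetAt(0, &accel);

        // Grab the most recent data report -- the "input report" as an app sees it.
        CComPtr<ISensorDataReport> report;
        if (SUCCEEDED(accel->GetData(&report)))
        {
            PROPVARIANT x;
            PropVariantInit(&x);
            if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_ACCELERATION_X_G, &x)))
                printf("X acceleration: %f g\n", x.dblVal);
            PropVariantClear(&x);
        }
    }
    CoUninitialize();
    return 0;
}
```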

But it goes beyond just a simple API. There are so many sensors – and the number keeps growing – that no system could succeed by pretending to understand them all. Instead, sensors “describe themselves” to the system via reports that allow the OS to work with the specific sensors present.

When the system starts up, the driver needs to know how the sensor will be reporting data; it does this by requesting a “feature report” from the sensor. This defines basic sensor information like the current “reporting interval” (how often the sensor will report data – sort of like reverse polling, if used that way); the minimum reporting interval (the maximum data update frequency for the sensor); the connection type; the sensor sensitivity; and perhaps a human-friendly nickname for the sensor. At the same time, the sensor reports its current state, defining the data fields and value types.
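Those feature-report-backed settings surface to applications as sensor properties. The fragment below is a minimal sketch using the same desktop Sensor API as above, with error handling trimmed; it reads a few of them from a sensor obtained as in the earlier snippet.

```cpp
// Sketch: the feature-report-backed settings show up to applications as sensor
// properties. "sensor" here is an ISensor* obtained as in the previous snippet.
#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>
#include <cstdio>

void DumpBasicProperties(ISensor *sensor)
{
    PROPVARIANT v;
    PropVariantInit(&v);

    // Human-friendly nickname reported by the sensor.
    if (SUCCEEDED(sensor->GetProperty(SENSOR_PROPERTY_FRIENDLY_NAME, &v)))
        wprintf(L"Name: %s\n", v.pwszVal);
    PropVariantClear(&v);

    // Current reporting interval, in milliseconds.
    if (SUCCEEDED(sensor->GetProperty(SENSOR_PROPERTY_CURRENT_REPORT_INTERVAL, &v)))
        wprintf(L"Report interval: %lu ms\n", v.ulVal);
    PropVariantClear(&v);

    // Minimum reporting interval, i.e., the fastest the sensor will update.
    if (SUCCEEDED(sensor->GetProperty(SENSOR_PROPERTY_MIN_REPORT_INTERVAL, &v)))
        wprintf(L"Minimum interval: %lu ms\n", v.ulVal);
    PropVariantClear(&v);
}
```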

During operation, “input reports” are sent to the driver either asynchronously, based on sensor report timing, or upon request by the driver (presumably based on a request from an application). Again, these reports are self-describing; the data is stored by the driver and, if the report was driven by an application request, is forwarded on to that application.
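For the asynchronous case, an application can register an event sink and let the platform push each new data report to it. The sketch below shows the shape of that with the desktop Sensor API; only the data-updated callback does anything, and the usual COM housekeeping is kept to a minimum.

```cpp
// Sketch: receiving input reports asynchronously. The app registers an event sink,
// and each new data report arrives as an ISensorDataReport in OnDataUpdated.
#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>
#include <cstdio>

class AccelEvents : public ISensorEvents
{
    LONG m_ref = 1;

public:
    // Standard IUnknown plumbing, kept minimal.
    STDMETHODIMP QueryInterface(REFIID riid, void **ppv)
    {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(ISensorEvents))
        {
            *ppv = static_cast<ISensorEvents *>(this);
            AddRef();
            return S_OK;
        }
        *ppv = nullptr;
        return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&m_ref); }
    STDMETHODIMP_(ULONG) Release()
    {
        ULONG ref = InterlockedDecrement(&m_ref);
        if (ref == 0) delete this;
        return ref;
    }

    // Called whenever the sensor delivers a new data report.
    STDMETHODIMP OnDataUpdated(ISensor *, ISensorDataReport *report)
    {
        PROPVARIANT x;
        PropVariantInit(&x);
        if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_ACCELERATION_X_G, &x)))
            printf("async X: %f g\n", x.dblVal);
        PropVariantClear(&x);
        return S_OK;
    }

    // Remaining notifications ignored in this sketch.
    STDMETHODIMP OnEvent(ISensor *, REFGUID, IPortableDeviceValues *) { return S_OK; }
    STDMETHODIMP OnLeave(REFSENSOR_ID) { return S_OK; }
    STDMETHODIMP OnStateChanged(ISensor *, SensorState) { return S_OK; }
};

// Hooking it up, with "accel" an ISensor* as before:
//   accel->SetEventSink(new AccelEvents());
```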

There are also sensor settings that may change over time – in particular, sensitivity and report frequency. The driver changes these by writing a feature report to the sensor – again, usually at the request of an application.
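From an application’s point of view, that request looks like setting a property on the sensor; the driver translates it into the corresponding feature-report write. A minimal sketch, again with error handling pared down:

```cpp
// Sketch: requesting a new reporting interval. The Sensor API hands this to the
// driver, which turns it into the corresponding feature-report write to the device.
#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>
#include <portabledevicetypes.h>
#include <atlbase.h>

HRESULT SetReportIntervalMs(ISensor *sensor, ULONG intervalMs)
{
    CComPtr<IPortableDeviceValues> request;
    HRESULT hr = request.CoCreateInstance(CLSID_PortableDeviceValues);
    if (FAILED(hr)) return hr;

    // Ask for a new current report interval, in milliseconds.
    hr = request->SetUnsignedIntegerValue(SENSOR_PROPERTY_CURRENT_REPORT_INTERVAL, intervalMs);
    if (FAILED(hr)) return hr;

    // The results collection reports which values the sensor actually accepted.
    CComPtr<IPortableDeviceValues> results;
    return sensor->SetProperties(request, &results);
}
```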

Microsoft has made an attempt to “standardize” certain sensors by grouping them into “categories” and “types.” These, as well as specific sensors from vendors, are identified by globally unique identifiers (GUIDs). Microsoft has assigned some of these; vendors can also define their own.
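Those GUIDs are visible to applications: each sensor the OS enumerates reports both a category GUID and a type GUID. A small illustrative sketch, assuming a manager object created as in the first snippet:

```cpp
// Sketch: every sensor the OS enumerates advertises a category GUID and a type GUID.
// "manager" is an ISensorManager* created as in the first snippet.
#include <windows.h>
#include <sensorsapi.h>
#include <sensors.h>
#include <atlbase.h>
#include <cstdio>

void ListSensorGuids(ISensorManager *manager)
{
    CComPtr<ISensorCollection> sensors;
    if (FAILED(manager->GetSensorsByCategory(SENSOR_CATEGORY_ALL, &sensors)))
        return;

    ULONG count = 0;
    sensors->GetCount(&count);
    for (ULONG i = 0; i < count; ++i)
    {
        CComPtr<ISensor> sensor;
        if (FAILED(sensors->GetAt(i, &sensor)))
            continue;

        SENSOR_CATEGORY_ID category = {};
        SENSOR_TYPE_ID type = {};
        sensor->GetCategory(&category);
        sensor->GetType(&type);

        // Print both GUIDs in the usual {xxxxxxxx-...} form.
        WCHAR cat[64], typ[64];
        StringFromGUID2(category, cat, 64);
        StringFromGUID2(type, typ, 64);
        wprintf(L"category %s  type %s\n", cat, typ);
    }
}
```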

In creating a sensor taxonomy, Microsoft has gone much further than Android has, as shown in Table 1, which lists the officially recognized sensors. In addition to these specific sensors, there are also provisions for “generic” and custom sensors, allowing inclusion of sensors that don’t yet exist or don’t fit into these buckets. Sensors can also be declared individually or in collections.


Table 1. Sensors supported by Windows 8, with a comparison against Android. Bold sensors are required by Windows. Android has no required sensors.

Notes:

  1. Not required yet, but may be in the future.
  2. Deprecated.
  3. Required only if a wireless WAN (WWAN, e.g., a cellular network) radio exists.

Note that I tried to find out what sensors are supported in iOS, but, unlike with Windows and Android, that information doesn’t seem nearly as available. Googling yields numerous books and courses you can pay for, but not much in the way of free information. Welcome to the Apple-controlled universe, I suppose. Maybe I didn’t look hard enough, but if the info is out there, it’s less accessible than that of its competitors.

Microsoft makes a distinction between so-called “physical” sensors – like accelerometers – and so-called “virtual” sensors – like an orientation sensor or an inclinometer – that represent data fused from multiple physical sensors. In fact, you might imagine that, as people figure out more and more ways to combine physical sensor inputs to provide new information, the virtual sensor category will grow more quickly than the physical one.

Presumably, data fusion can also take place at the application level, which gives at least three levels at which fusion can reside: within the sensor itself; in low-level code, yielding a virtual sensor; and within an application.
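As a toy illustration of that last, application-level option, here’s the classic complementary filter that blends gyroscope and accelerometer data into a pitch estimate – purely an example of where fusion can live, not anything specified by Windows.

```cpp
// Toy illustration of application-level fusion: a complementary filter that blends a
// gyroscope's pitch rate (smooth but drifty) with the accelerometer's gravity-based
// tilt (noisy but drift-free). Nothing here is Windows-specific.
#include <cmath>
#include <cstdio>

struct PitchEstimator
{
    double pitchDeg = 0.0;  // current estimate, in degrees
    double alpha = 0.98;    // how much to trust the integrated gyro term

    // gyroRateDegPerS: pitch rate from a gyroscope (degrees/second)
    // ax, ay, az:      accelerations from an accelerometer, in g
    // dtSeconds:       time since the last update
    void Update(double gyroRateDegPerS, double ax, double ay, double az, double dtSeconds)
    {
        const double kRadToDeg = 180.0 / 3.14159265358979323846;

        // Tilt implied by gravity alone.
        double accelPitchDeg = std::atan2(-ax, std::sqrt(ay * ay + az * az)) * kRadToDeg;

        // Blend: integrate the gyro, then nudge the result toward the accelerometer answer.
        pitchDeg = alpha * (pitchDeg + gyroRateDegPerS * dtSeconds) + (1.0 - alpha) * accelPitchDeg;
    }
};

int main()
{
    PitchEstimator est;
    est.Update(/*gyro*/ 1.5, /*ax*/ 0.02, /*ay*/ 0.0, /*az*/ 0.98, /*dt*/ 0.01);
    printf("pitch estimate: %f degrees\n", est.pitchDeg);
    return 0;
}
```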

Microsoft has also gone further than the others by requiring a minimum set of sensors, indicated in bold in the above table. This sets a bar: for any Windows 8 device, regardless of size or usage, there is a base set of sensors that isn’t optional. The assumption would seem to be that every such device will have to be able to provide, for example, acceleration and rotation information.

If that sensor information is irrelevant for the device at hand, presumably it would be possible to fake Windows out by providing the reports using dummy data – since, clearly, having to provision a system with a nonsensical sensor just for the sake of the OS would be silly with respect to BOM cost – not to mention space. Perhaps the assumption is that, if the system is big enough to run Windows, then the basic sensor set will, through universal harmonic convergence, be relevant.

So, regardless of whether or not you like Windows, or Windows 8 in particular, Microsoft seems to have done a decent job of forcing sensor vendors to take notice. Whether or not the requirements they lay down compromise system architecture openness – and whether or not that’s a good thing – remains to be seen. Right now folks are simply saluting and meeting the requirements.


How do you view Microsoft’s role in specifying sensor interfaces? Useful? A no-op? Just part of what needs to be done to get your sensor supported?

