
Separating You from Your Phone

In high-school physics class, we did an experiment. It’s so crude by today’s standards that I feel like something of a fossil as I recall it, but here goes. We had a ticker-tape kind of thing that would make a mark on a paper tape as you pulled the tape through. It marked at a constant frequency, so if you pulled the tape faster, the dots were farther apart. So dot spacing became a measure of speed.

The experiment consisted of two parts. In the first, we held the tape and walked a distance, swinging our arms like normal. In the second, we walked the same distance at the same speed, but holding our arms still.

In the first case, the dots told a tale of acceleration and deceleration, repeated over and over as our arms moved forward and then backward. The second case showed no such variation; the speed was consistent. But the trick was, if you averaged the speeds from the first tape, you ended up with the exact same speed as the second one*. Which is obvious with just a little thought: it’s the speed we were actually walking.
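
Just to convince myself that the averaging claim holds, here’s a quick simulation of the tape (mine, not part of the original lab, with made-up numbers for the tick rate, walking speed, and arm swing): the arm swing adds a sinusoid to the hand speed, and averaging over whole swing cycles cancels it out.

```python
import numpy as np

TICK_HZ = 10.0     # assumed ticker frequency: marks per second
WALK_SPEED = 1.4   # assumed steady walking speed, m/s
SWING_SPEED = 0.5  # assumed peak speed the arm swing adds or removes, m/s
SWING_HZ = 1.0     # assumed arm-swing frequency, Hz
DURATION = 20.0    # a whole number of swing cycles, s

t = np.arange(0.0, DURATION, 1.0 / TICK_HZ)
hand_speed = WALK_SPEED + SWING_SPEED * np.sin(2 * np.pi * SWING_HZ * t)

# Dot spacing on the tape is speed divided by the tick rate, so turning the
# spacings back into speeds and averaging them is just averaging the sampled
# hand speeds -- which lands right back on the steady walking speed.
dot_spacing = hand_speed / TICK_HZ
average_speed = np.mean(dot_spacing * TICK_HZ)

print(f"average of per-dot speeds: {average_speed:.3f} m/s")  # ~1.400
print(f"steady walking speed:      {WALK_SPEED:.3f} m/s")     #  1.400
```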

This was an early case of, well, not sensor fusion, but, how about if we call it “implied signal extraction.” In this case, there was only one sensor (the tape), which is why there’s no fusion. But in modern times, such extraction might involve fusion.

Here’s the deal: the tape was directly measuring the speed of our hands, when what we were really interested in was the speed of our moving bodies. By averaging the hand movements, we were able to extract the implied body movement signal out of a raw hand movement signal that contained lots of potentially misleading artifacts.
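
Put in modern terms, that averaging is just a crude low-pass filter. Reusing the made-up numbers from the sketch above, a moving average spanning exactly one swing cycle strips out the arm artifact and leaves the implied body signal, even when the body’s speed is itself drifting slowly (again, a toy of my own, not anyone’s shipping algorithm).

```python
import numpy as np

TICK_HZ, SWING_HZ = 10.0, 1.0
window = int(TICK_HZ / SWING_HZ)  # samples in one full arm-swing cycle

t = np.arange(0.0, 60.0, 1.0 / TICK_HZ)
body_speed = 1.4 + 0.2 * np.sin(2 * np.pi * 0.02 * t)  # the signal we actually want
arm_swing = 0.5 * np.sin(2 * np.pi * SWING_HZ * t)     # the misleading artifact
hand_speed = body_speed + arm_swing                    # what the tape measures

# Averaging over exactly one swing cycle cancels the swing and keeps the body signal.
kernel = np.ones(window) / window
implied_body_speed = np.convolve(hand_speed, kernel, mode="same")

error = np.abs(implied_body_speed[window:-window] - body_speed[window:-window])
print(f"worst-case extraction error: {error.max():.4f} m/s")  # small
```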

This is happening in spades today in the navigation/orientation business. It will be obvious to the folks who have been trying to manage the problem for a while, but the rest of us may not realize how tough it is. We expect that, with our phones, we now have a way to navigate simply because our phone goes with us.

But put your phone in your hand. Now extend your arm forward: according to the phone, you just moved forward a foot or so. But you didn’t: your arm moved your phone forward; you didn’t go anywhere. Now put your phone in your back pocket, display to the outside. According to your phone, you just turned around. But you didn’t: you turned your phone around as you put it in your pocket. (Heck, the phone might even think you’re standing on your head if you put it in your pocket upside down.)
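
To put the back-pocket example in concrete terms, here’s a toy of my own (the numbers and the body_heading_from_phone helper are invented, not from any real system): the phone reports its own heading, which is the walker’s heading plus whatever offset the phone happens to have on the body at that moment.

```python
def body_heading_from_phone(phone_heading_deg: float, mount_offset_deg: float) -> float:
    """Recover the walker's heading -- but only if the phone-on-body offset is known."""
    return (phone_heading_deg - mount_offset_deg) % 360.0

walker_heading = 90.0        # person walking due east
back_pocket_offset = 180.0   # phone turned around as it went into the back pocket
phone_heading = (walker_heading + back_pocket_offset) % 360.0

print("phone thinks it's heading:", phone_heading)                         # 270.0
print("walker is actually heading:",
      body_heading_from_phone(phone_heading, back_pocket_offset))          # 90.0
```

The catch, of course, is that back_pocket_offset is precisely the number nobody tells the phone, and it changes every time we handle the thing.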

This drives at the art of orientation-to-trajectory management, a topic I discussed with Movea’s Tim Kelliher at Sensors Expo, and something Movea is working on. Unlike my high school scenario, where, if done right, we’re essentially averaging out a well-controlled sinusoidal movement, our phones go all over the place while we stand in one place. We pick it up, turn it around to orient it properly, switch hands, drop it, put it into one pocket or another, wave it randomly when we try to swat away that bee with our phone-holding hand.

Oh, and we can also do all of this while walking. Or running. Or dancing. Or running in random directions while we try to escape that bee, hands still aflail.

When you think about it, it’s got to be really hard to evaluate all of the sensor inputs on the phone and extract from them a signal that describes how the phone holder is moving. The more I think about it, the more I feel like I would have no idea how to start. Presumably some heuristics would be involved, but even then, it’s not obvious.

For instance, if the proximity sensor is firing, then you might assume that you’re probably on a call, and so conclude that the phone is stationary with respect to the body, up by your ear. That might be right 90% of the time, but then some goofball will, just for sh…ucks and grins, move the phone sultrily up and down along his or her body, keeping it close. The “on a call” heuristic would then decide that we’re walking up and down hills.
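
In code, that kind of heuristic might look something like the caricature below (entirely my own strawman, with a made-up SensorSnapshot and guess_carry_mode, not anything a real vendor ships): use the proximity sensor to guess how the phone is being carried, then reinterpret the motion data accordingly.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    proximity_near: bool   # proximity sensor covered (something right against the screen)
    steps_detected: bool   # step-detector output over the last second or so

def guess_carry_mode(snapshot: SensorSnapshot) -> str:
    """Guess how the phone relates to the body so its motion can be reinterpreted."""
    if snapshot.proximity_near:
        # Assume "on a call": the phone is fixed by the ear, so any vertical
        # motion it feels gets attributed to the body (hills, stairs...).
        # This is exactly the assumption the sultry goofball breaks.
        return "at-ear"
    if snapshot.steps_detected:
        return "carried-while-walking"
    return "unknown"

print(guess_carry_mode(SensorSnapshot(proximity_near=True, steps_detected=False)))
print(guess_carry_mode(SensorSnapshot(proximity_near=False, steps_detected=True)))
```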

So when solutions to this problem are finally announced, I can imagine the aforementioned goofball types trying all kinds of things to see if they can fool the system. Typical silliness, but it would also provide clues about how the algorithm works.

For the rest of us, well, let’s not take it for granted. This is a hard problem, and any effective solution will have been hard won.

 

*It actually didn’t work for me; my teacher declared, in frustration, that I needed to learn to walk at a consistent speed. Not sure if I’ve mastered that yet; it’s not high on my bucket list…
