feature article

Next-Generation Motion Sensing Facilitates Immersive Spatial Audio

I was sitting here thinking (don’t worry, I’ll try not to make a habit of it). I was contemplating how, even when we think things have evolved to be rather good (technologically speaking), someone invariably leaps onto the center of the stage with a fanfare of flugelhorns, brandishing a “new and improved” solution.

This just happened to me twice in one day (I’m too young for all this excitement). The person to blame is Chad Lucien, Vice President and General Manager of the Sensor and Audio Business Unit at Ceva.

As a reminder, Ceva specializes in licensable intellectual property (IP) cores, tools, and software that fall into several key categories: DSP cores that are designed for tasks like audio, voice, radar, sensor fusion, and baseband processing; artificial intelligence (AI) and artificial neural network (ANN) cores specialized for edge AI and neural network inference; wireless connectivity IPs for standards like Bluetooth, Wi-Fi, UWB (ultra-wideband), and NB-IoT, LTE-M, etc.; and sensor fusion and sound processing.

I don’t know about you, but whenever someone says “Ceva,” I always think of its wide range of silicon IP that is primarily used in SoCs (Systems-on-Chip), but that can also be deployed in FPGAs and DSP (Digital Signal Processing)-based platforms, depending on the application and the integration flow.

What I tend to forget is that the guys and gals at Ceva also have a proud portfolio of embedded application-centric software IP products that they license to OEMs, and it’s two of these entities about which I wish to waffle.

Let’s start with Ceva’s MotionEngine, which is a software-based sensor fusion engine designed to enable precise motion tracking, orientation, and activity recognition using data from inertial and environmental sensors (e.g., accelerometers, gyroscopes, magnetometers, barometers).

Consider a TV remote controller, for example. In this case, a typical arrangement would be to have a MEMS (micro-electromechanical systems)-based IMU (Inertial Measurement Unit) mounted in the controller. The MotionEngine software can be deployed directly on the controller, in which case the controller processes the raw sensor data and wirelessly transmits actionable information to the TV. Alternatively, the MotionEngine software can reside in the TV itself, in which case the controller would transmit the raw sensor data to the TV for processing.

Just a few days ago, as I pen these words, the chaps and chapesses at Ceva launched their latest offering in this arena in the form of MotionEngine Hex. This encompasses a full product line for TV and STB (Set-Top Box) remotes, pens, presenters, game controllers, and more. Once again, this can be deployed in the embedded system (e.g., a TV remote controller) or in the host system (e.g., a TV).

The term 6-DoF (Six Degrees of Freedom) refers to the six independent ways in which a rigid body can move in 3D space. These are grouped into the following:

Three Rotational Movements (Angular)

  • Roll: Rotation around the X-axis (tilting side to side)
  • Pitch: Rotation around the Y-axis (tilting forward/backward)
  • Yaw: Rotation around the Z-axis (turning left/right)

Three Translational Movements (Linear)

  • Surge: Forward/backward movement (along the X-axis)
  • Sway: Left/right movement (along the Y-axis)
  • Heave: Up/down movement (along the Z-axis)
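The two groups above map naturally onto a simple data structure. Here is a minimal sketch (my own illustration, not Ceva's API) of a container for a full 6-DoF pose:

```python
# Hypothetical sketch (not Ceva's API): a minimal container for the
# six degrees of freedom described above.
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Rotational (angular) movements, in degrees
    roll: float   # rotation around X (tilting side to side)
    pitch: float  # rotation around Y (tilting forward/backward)
    yaw: float    # rotation around Z (turning left/right)
    # Translational (linear) movements, in meters
    surge: float  # forward/backward along X
    sway: float   # left/right along Y
    heave: float  # up/down along Z

# Example: head turned 45 degrees right, tilted 10 degrees down,
# body one step forward and slightly crouched
pose = Pose6DoF(roll=0.0, pitch=-10.0, yaw=45.0,
                surge=1.2, sway=0.0, heave=-0.3)
```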

I find it helpful to think of these in the context of wearing a virtual reality (VR) or augmented reality (AR) headset. In this case, the rotational movements would be associated with me tilting or turning my head, while the translational movements would be associated with me moving my body (e.g., walking forward and crouching down).

3-DoF vs. 6-DoF (Source: Ceva)

As its name might suggest, MotionEngine Hex tracks 6-DoF. However, this does not imply that it employs only an IMU with a 3-axis accelerometer and a 3-axis gyroscope. No siree, Bob! In fact, MotionEngine Hex integrates data from a 3-axis accelerometer, a 3-axis gyroscope, and a UWB-based position sensor to compute full 6-DoF position + orientation.

The addition of the UWB “axis” dramatically increases the temporal resolution and spatial accuracy of the system. For example, it can be used to accurately detect the direction in which the remote is pointing, thereby enabling “point-and-control” interactions (think using a laser pointer, but without the laser part).
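To give a flavor of how this kind of fusion works in general (MotionEngine Hex's internals are proprietary, so this is purely illustrative), orientation typically comes from blending the gyroscope's smooth-but-drifting rate with the accelerometer's noisy-but-drift-free gravity reference, while UWB ranging supplies the absolute position that a pure IMU cannot:

```python
# Illustrative only -- MotionEngine Hex's actual algorithms are
# proprietary. This sketch shows the general idea of fusing an IMU
# (orientation) with a UWB ranging sensor (position) into one 6-DoF
# estimate, using a basic complementary filter for pitch.
import math

def fuse_pitch(prev_pitch_deg, gyro_rate_dps, accel_xyz, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth, but drifts over time)
    with the accelerometer's gravity-based tilt (noisy, but drift-free)."""
    ax, ay, az = accel_xyz
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# One update step: a stationary, level device (gravity straight down)
pitch = fuse_pitch(prev_pitch_deg=0.0, gyro_rate_dps=0.0,
                   accel_xyz=(0.0, 0.0, 9.81), dt=0.01)

# UWB ranging supplies the absolute position (values assumed);
# together, orientation + position form the full 6-DoF state.
position_m = (1.50, 0.20, 1.10)  # x, y, z in meters
state = {"pitch_deg": pitch, "position_m": position_m}
```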

As an aside, my wife (Gigi the Gorgeous) and I know two identical sisters of advanced years who subscribe to every conspiracy theory going, and who are best described as “technophobic hypochondriacs.” For example, they are card-carrying members of the self-identified group of individuals who suffer from Electromagnetic Hypersensitivity (EHS), and they don’t like to suffer quietly or alone. Thankfully, this abstract affliction prevents them from visiting our house, which is awash in electromagnetic waves.

Of course, I could inform the sisters (but not convince them) that UWB operates just above the “noise floor” (think of the hum or static that’s always present to some degree, even when nothing useful is happening), and is much lower than things like Bluetooth and Wi-Fi in terms of its power spectral density.

GPS, Bluetooth, Wi-Fi, etc. vs. UWB in terms of power spectral density (Source: Ceva)

There are numerous implications to all of this, not least that it opens the door for a new raft of user interaction options (remember that the lads and lasses at Ceva don’t implement any user interactions themselves; instead, they provide the position + orientation data to the OEMs to make of it what they will, as it were).

What types of user interactions are we talking about? Well, let’s stick with a TV remote as our example. You know when you’re watching a program and you want to fast-forward or rewind, so you access the progress bar (also known as the scrubber bar, seek bar, seek slider, or playback bar). You then endure a frustrating few minutes cycling through the 2x, 4x, 8x, 16x, 32x fast-forward/rewind multiples, overshooting and undershooting until you finally home in on your desired destination. By comparison, assuming a remote equipped with MotionEngine Hex, imagine having the ability to simply press, hold, and drag the slider to your intended location. Even better, imagine simply “scrubbing” the slider off-screen to commence the next (or previous) episode.
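The press-and-drag idea boils down to a simple mapping from the remote's pointing angle to a position on the timeline. The sketch below is hypothetical (the function name and the 60-degree sweep are my own assumptions, not Ceva's or any OEM's API), but it captures the gist:

```python
# Hypothetical sketch: map the remote's yaw angle to a playback-timeline
# position while the user presses-and-drags. The 60-degree sweep is an
# assumption, not a value from Ceva or any OEM.
def yaw_to_timeline(yaw_deg, duration_s, sweep_deg=60.0):
    """Map yaw in [-sweep/2, +sweep/2] to [0, duration_s], clamped."""
    frac = (yaw_deg + sweep_deg / 2.0) / sweep_deg
    frac = min(max(frac, 0.0), 1.0)  # clamp to the timeline's ends
    return frac * duration_s

# Pointing dead ahead (yaw = 0) lands mid-episode of a one-hour program
assert yaw_to_timeline(0.0, 3600.0) == 1800.0
```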

Similarly, when setting the volume, rather than repeatedly pressing the “Volume Up” and “Volume Down” buttons, consider simply pressing the “Volume” button and then smoothly adjusting the volume by raising or lowering your hand (while holding the remote, of course).

Now, let’s turn our attention to Ceva’s RealSpace, which is a spatial audio and head-tracking software suite that delivers immersive 3D audio experiences across various devices, including earbuds, headphones, XR headsets, soundbars, and mobile/PC platforms.

RealSpace creates a virtual placement of sound anywhere in 3D space with pinpoint accuracy. Of particular interest in the context of our discussions here is that RealSpace can take full advantage of Ceva’s MotionEngine, which delivers accurate head orientation and head tracking to provide a more realistic audio experience.

Can I give you an example of what I’m talking about? You bet your booties I can! Have you heard tell of boAt Lifestyle? This is an Indian company that designs and markets consumer electronics, with a focus on audio products like earphones, headphones, and speakers. This entity has grown to become a leading player in the Indian audio market and a top wearable brand globally.

The reason I mention this here is that boAt’s Nirvana Eutopia wireless headphones feature Ceva’s RealSpace and MotionEngine technologies. Unfortunately, you can’t purchase these little beauties in the USA. Fortunately, I know a man who knows a man who… which explains the set sitting on my desk in front of me as illustrated below.

Meet my boAt Nirvana Eutopia spatial audio headphones (Source: Clive “Max” Maxfield)

At first, when you place these headphones on your pate, apart from being super comfortable and offering an awesome audio experience, they don’t seem radically different from other headphones you may have used in the past. And then you press the “3D Spatial Audio” button and start in surprise as your socks are blown off your feet. Now, when you turn your head, the source(s) of the sound appear to maintain their location(s) in 3D space (where no one can hear you scream).

Take a simple audiobook, for example. By default, it sounds as though the narrator is situated in front of you. If you turn your head to the left when spatial audio is engaged, the sound will appear to come from your right; if you turn your head to the right, the sound will appear to come from your left.
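The core head-tracking trick behind this effect can be sketched in a couple of lines (purely illustrative; RealSpace's actual rendering is proprietary): the rendered source direction is the world-fixed source azimuth minus the current head yaw, so the sound stays anchored in the room as your head turns.

```python
# Illustrative only -- RealSpace's actual rendering is proprietary.
# The rendered direction of a sound source, relative to the listener's
# nose, is the world-fixed source azimuth minus the current head yaw.
def rendered_azimuth(source_azimuth_deg, head_yaw_deg):
    """Convention (assumed): positive azimuth = to the listener's right;
    result is wrapped into (-180, 180] degrees."""
    return ((source_azimuth_deg - head_yaw_deg + 180.0) % 360.0) - 180.0

# Narrator straight ahead (0 deg). Turn your head 90 deg to the left
# (yaw = -90): the voice now appears 90 deg to your right, exactly as
# described above.
assert rendered_azimuth(0.0, -90.0) == 90.0
```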

Now, imagine wearing these while playing a shoot-em-up game on your PC. Suppose you hear shouting and shooting to your left. When you turn your head, these sounds will appear to be coming from in front of you, thereby increasing your sense of realism. Things are even more amazing when you use these headphones in conjunction with a VR/AR headset like the Quest 3.

This is an example of what I was waffling about at the beginning of this column. I used to think that traditional headphones were pretty good when all they did was feed sound into my ears (“That’s so 20th Century, my dear!”). Prior to becoming aware of Ceva’s spatial audio technology, I had never considered the concept of headphones that were spatially aware. Now, both my headphones and I are enjoying a higher state of awareness (“Welcome to the 21st Century, we hope you’ll enjoy your stay”).

But I fear we are in danger of bumbling into the brume, so let’s try to bring everything back home. Chad (do you remember Chad from the beginning of this column?) informed me that Ceva has just announced a new collaboration with nothing. Sorry, I meant to say that Ceva has just announced a new collaboration with Nothing, the London-based consumer technology company known for its transparent, design-forward smartphones, audio products, and wearables that blend style with performance. 

By leveraging Ceva’s industry-leading spatial audio technology and premium wireless audio SoCs powered by Ceva’s Bluetooth IP, Nothing aims to redefine how users engage with music, movies, and games on wireless audio devices.

As part of this, Ceva’s RealSpace spatial audio software is being added to Nothing’s upcoming audio lineup, including the newly announced Nothing Headphone (1), which is the company’s highly anticipated first entry into the over-ear audio category.

Oooh, tasty!

I wonder if a pair of these bodacious beauties is lurking in my future (I certainly hope so). In the meantime, as always, I’d love to hear your thoughts on any of this. I welcome your captivating comments, insightful questions, and sagacious suggestions.
