
AI at the Edge? Meet Wi-Fi in Microwatts

I’ve just been introduced to a technology that has the potential to revolutionize battery- and ambient-powered AI-enabled sensors by letting them communicate over Bluetooth or Wi-Fi while consuming only 1/1000th the power of regular Bluetooth and Wi-Fi implementations. Intrigued? Read on!

Before we plunge into the fray with gusto and abandon, indulge me, if you will. I read a lot of science fiction. One of the books I read recently stuck in my mind more than most. I’m talking about The Last Human by Zack Jordan. This is packed with myriad mind-boggling ideas, all competing for the reader’s attention. The story is set in a far future dominated by “The Network.” This is a galaxy-spanning collective of millions of alien species connected through implants, sharing knowledge, culture, and peaceful coexistence.

The Network is an always-on super-AI-based information and governance layer that connects trillions of beings—biological and artificial—into a shared ecosystem. Every species and individual has an associated “tier” (a rating of intelligence/capability).

  • Tier 1 = barely sapient
  • Tier 2 & 3 = normal biological species
  • Tier 4 = superintelligences / AIs
  • Tier 5 = near-godlike entities

This tier system isn’t a neat numerical scale (although words like “exponential” and “logarithmic” do spring to mind), but rather a set of huge qualitative leaps in cognition, where each tier represents an entirely new class of mind rather than a simple multiplier.

Tier 4 beings and above—superintelligent artificial minds—don’t just use the Network; they live inside it. They link through high-bandwidth direct interfaces, shared computation layers, distributed processing, and collective awareness zones. Tier 4 minds can jointly reason across vast distances, spawn sub-processes, merge or split, run simulations, and influence the lower tiers indirectly through “guidance.”

Their thinking is faster than biological time, so from their perspective, the whole galaxy is a sort of real-time distributed AI environment. The AIs essentially govern the galaxy, though always under the Network’s overarching rules, and even they are constrained by the Network’s original safety protocols.

Biological species like our hero (the last human) do not think at the speed required to interact directly with the Network. They must use implants. For most species, this implant is installed early in life. It acts as a gateway between slow biological minds and the high-speed realm of the Network. Every citizen has a brain implant (or equivalent biological connector) that provides identity (the user’s presence on the Network authenticated at the neurological level), sensory overlays, translation, messaging, augmented reality (AR) guidance, behavioral nudges, and safety protocols.

The implant also includes a local, personal companion AI that converses with its user. This AI filters incoming Network data so you aren’t overwhelmed, prioritizes messages, translates context into your species’ cognition, and prevents dangerous sensory overload. It’s a bit like a hyper-advanced Siri/Cortana. It helps you recall memories, schedule things, guide your movement through AR, and “nudges” you when you’re about to do something suboptimal. It also behaves like what we currently think of as agentic AI, in that you can ask it to perform a task, such as locating specific information, and it will dispatch AI agents to perform your bidding.

Now, this is where things get interesting (and where we make a desperate attempt to connect to the core of this column). In the Network, everything that’s sentient enough to make a decision counts as a “mind,” no matter how tiny. This includes things such as doors, toilets, trash chutes, cleaning bots, ventilation controllers, safety monitors, light switches, and minor repair drones.

These micro-intelligences are often Tier 0.1 to Tier 0.2, which is barely above instinct. Think of a simple reflex loop with a personality veneer. Most are roughly equivalent to a loyal dog, a Roomba, or a VERY obedient child. They have enough cognition to perform their function, detect anomalies, follow Network safety laws, recognize a being’s tier, communicate danger, and ask for help when overwhelmed, but not enough to have ambitions or complex emotions.

However, because the Network demands universal safety and predictability, even these micro-intelligences get a unique identity, a sliver of cognition, a connection to the Network, obligations to follow rules, and the ability to “complain” if they feel they’re being mistreated. All this results in a universe where everything is watching, reporting, and quietly thinking.

I can easily envisage living in a world surrounded by intelligent sensors. In many cases, I’d welcome it (I’m thinking about all the water leaks, HVAC issues, and humidity-in-crawlspace problems we’ve been fighting in our house recently).

I think we are heading toward such a world, one shuffling step at a time. It seems like every day we hear about advances in software space (where no one can hear you scream), such as large language models (LLMs), generative AI, and agentic AI. We also hear about developments on the hardware side, including incredibly low-power inference engines that can be deployed in battery- or ambient-powered nodes.

I’m thinking about technologies like BrainChip’s Akida, which is a digital, event-based, spiking neural network (SNN) neuromorphic processor (see Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront), and POLYN’s Neuromorphic Analog Signal Processing (NASP) technology, which is an analog non-spiking neuromorphic processor (see Analog Neuromorphic Processors for ASICs/SoCs Offer Microwatt Edge AI).

What we are talking about here is the ability to perform inference operations at the edge with only microwatts of power. These sensors and inference engines employ all sorts of tricks, like spending most of their time in sleep mode, waking only occasionally to take a reading and report back to “head office” via a local router.
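To get a feel for why this duty-cycling trick works so well, here’s a minimal Python sketch of the average-power arithmetic; every number in it (sleep power, active power, wake time, reporting interval) is an illustrative assumption rather than the spec of any particular sensor.

# Back-of-the-envelope duty-cycle math for a sleepy sensor node.
# All numbers below are illustrative assumptions, not vendor specs.

SLEEP_POWER_W  = 2e-6    # 2 µW while asleep (assumed)
ACTIVE_POWER_W = 5e-3    # 5 mW while sampling and transmitting (assumed)
ACTIVE_TIME_S  = 0.01    # 10 ms awake per reporting cycle (assumed)
PERIOD_S       = 60.0    # one report per minute (assumed)

duty_cycle = ACTIVE_TIME_S / PERIOD_S
average_power_w = (ACTIVE_POWER_W * duty_cycle
                   + SLEEP_POWER_W * (1.0 - duty_cycle))

print(f"Duty cycle:    {duty_cycle:.5%}")
print(f"Average power: {average_power_w * 1e6:.1f} µW")

Even with a 5 mW active burst, the once-a-minute schedule drags the average down to a few microwatts, which is why the sleep current, not the burst, usually ends up dominating the budget.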

This all sounds great if you speak quickly and gesticulate furiously. What I’ve never really considered is the power budget associated with the wireless communication component. A quick “back of the envelope” ponder reveals that if we assume an AI inference in the 10–100 microwatt (10⁻⁵ to 10⁻⁴ W) range, then a Bluetooth radio transmission of 10–50 milliwatts (10⁻² to 5×10⁻² W) consumes 1,000X to 5,000X more power than a 10-microwatt on-sensor inference. Similarly, a Wi-Fi transmission of 200 to 1,000+ milliwatts (2×10⁻¹ to 1+ W) consumes roughly 20,000X to 100,000X more.
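If you’d like to run the same numbers yourself, here’s a tiny Python version of that envelope, using the 10-microwatt end of the inference range as the baseline (the radio figures are the same rough ranges quoted above, not measurements of any specific chipset).

# Back-of-the-envelope check of the radio-vs-AI power ratios quoted above,
# using the 10 µW end of the inference range as the baseline.

AI_INFERENCE_W = 10e-6              # 10 µW on-sensor inference (best case)
ble_tx_w  = (10e-3, 50e-3)          # 10–50 mW Bluetooth LE transmit (rough)
wifi_tx_w = (200e-3, 1.0)           # 200 mW – 1 W Wi-Fi transmit (rough)

for name, (low, high) in (("BLE", ble_tx_w), ("Wi-Fi", wifi_tx_w)):
    print(f"{name}: {low / AI_INFERENCE_W:,.0f}x to "
          f"{high / AI_INFERENCE_W:,.0f}x the AI power")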

This power gap is ridiculous. It makes my eyes water just to think about it (I’m not crying; I just have something in my eye).

All of which leads us to the point of this column (yes, of course there’s a point; did you ever doubt me?). I was just chatting with Patricia Bower, VP of Product Management at HaiLa Technologies.

After the customary pleasantries (everyone seems to love my Hawaiian shirts), the first thing Patricia told me was, “We’re a radio semiconductor company.” The second thing she told me is that the real barrier to ubiquitous IoT isn’t sensors, processors, or even the AI—it’s the cost of pushing bits through the air. Although she didn’t use these exact words, the gist was that “Radios are the long pole in the IoT tent, and HaiLa’s mission is to shrink that pole down to the size of a cocktail toothpick.”

HaiLa’s secret sauce is a clever combination of bi-static passive backscattering and ultra-low-power active transmit, applied not to exotic new protocols but to the existing global wireless workhorses: Bluetooth Low Energy (BLE) and Wi-Fi. These technologies are everywhere—phones, laptops, access points—so instead of reinventing the wireless stack, HaiLa found a way to “ride on top of it” while using almost no energy.

The best analogy—and the one HaiLa themselves use—is the laser-and-mirror story. Generating a laser beam is expensive in terms of power, but holding up a mirror and wiggling it to modulate the beam as it passes by is practically free. Traditional radios are the “lasers” in our analogy. They generate power-hungry RF waveforms. HaiLa’s devices behave more like “mirrors,” reflecting and subtly modulating existing Wi-Fi or BLE signals as they zip by.

This is the essence of passive backscatter: the HaiLa device doesn’t synthesize its own carrier; it simply changes the impedance of its antenna so that the reflected signal carries new information. It’s the same principle used in RFID, except HaiLa adapts it to mainstream protocols like Wi-Fi and BLE, which is a giant leap forward because RFID requires special readers, whereas Wi-Fi and BLE devices are already everywhere.
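Here’s a deliberately idealized Python/NumPy sketch of that principle: a tag that never generates a carrier and only flips its antenna between two reflection states. The carrier frequency, data rate, and reflection coefficients are all illustrative assumptions, not HaiLa parameters.

import numpy as np

# Idealized passive backscatter: the tag never synthesizes a carrier; it only
# switches its antenna reflection coefficient between two states, so the
# reflected wave carries the tag's bits. All values are illustrative.

fs = 100e6                       # sample rate (Hz)
fc = 5e6                         # incident carrier (scaled down for clarity)
t = np.arange(0, 1e-3, 1 / fs)   # 1 ms of signal

incident = np.cos(2 * np.pi * fc * t)        # carrier arriving from the AP

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])    # tag data (assumed)
bit_rate = 8e3                               # 8 kbit/s (assumed)
bit_idx = np.minimum((t * bit_rate).astype(int), len(bits) - 1)

# Two reflection coefficients: "mostly absorb" (0.1) vs "mostly reflect" (0.9).
gamma = np.where(bits[bit_idx] == 1, 0.9, 0.1)

backscattered = gamma * incident             # what the second radio receives
ones  = backscattered[bits[bit_idx] == 1]
zeros = backscattered[bits[bit_idx] == 0]
print("Reflected power ratio, '1' bits vs '0' bits:",
      round(np.mean(ones ** 2) / np.mean(zeros ** 2), 1))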

HaiLa passive backscattering on Wi-Fi (Source: HaiLa)

HaiLa’s flagship technique is bi-static passive backscattering on Wi-Fi: one Wi-Fi device transmits a “blank” frame, and a second Wi-Fi device receives the reflected frame. The HaiLa-equipped sensor simply detects a particular waveform from the 802.11 spec (specifically the old 1–2 Mbps DSSS format that all modern access points still support), embeds its sensor data into the frame, and reflects it toward that second receiver.
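For the terminally curious, that legacy 1 Mbps DSSS waveform spreads each data bit with an 11-chip Barker sequence at 11 Mchip/s, which is part of what makes it easy to detect. Here’s a bare-bones Python sketch of just the spreading step (the differential BPSK encoding and scrambling are omitted, and none of this is HaiLa’s implementation).

import numpy as np

# The 11-chip Barker sequence used by 802.11's legacy 1 Mbps DSSS mode.
BARKER_11 = np.array([+1, -1, +1, +1, -1, +1, +1, +1, -1, -1, -1])

def spread(bits):
    """Map bits {0,1} to {+1,-1} symbols and spread each with the Barker code."""
    symbols = 1 - 2 * np.asarray(bits)   # 0 -> +1, 1 -> -1
    return np.concatenate([s * BARKER_11 for s in symbols])

chips = spread([1, 0, 1, 1])
print(f"{len(chips)} chips for 4 bits: 11 Mchip/s carries 1 Mbit/s")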

Because the reflected signal is extremely weak, HaiLa uses a clever 50 MHz channel shift to move the backscattered frame away from the original downlink channel. This provides the receiver with sufficient adjacent-channel rejection to detect the whisper-quiet reflection. None of this requires HaiLa to generate its own RF carrier, which is why power consumption is microscopic.
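HaiLa hasn’t spelled out its modulator design here, but a common way backscatter systems achieve this kind of offset is to toggle the antenna’s reflection coefficient at the shift frequency, which relocates the reflected energy to the carrier frequency plus or minus that offset. Here’s a scaled-down, idealized Python model of the effect (the frequencies are chosen purely to keep the FFT small, not to match real Wi-Fi channels).

import numpy as np

# Toggling the reflection coefficient at f_shift multiplies the incident
# signal by a square wave, whose fundamental moves most of the reflected
# energy to fc ± f_shift.

fs      = 1e9          # sample rate
fc      = 100e6        # stand-in for the Wi-Fi downlink carrier
f_shift = 50e6         # toggle rate = desired channel offset
t = np.arange(0, 20e-6, 1 / fs)

incident  = np.cos(2 * np.pi * fc * t)
square    = np.sign(np.cos(2 * np.pi * f_shift * t))   # impedance toggle
reflected = incident * square

spectrum = np.abs(np.fft.rfft(reflected))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
top2 = freqs[np.argsort(spectrum)[-2:]] / 1e6
print("Strongest reflected components (MHz):", np.sort(top2))  # ~[50. 150.]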

With their current evaluation silicon, HaiLa can run a Wi-Fi-connected temperature-and-humidity sensor at ~50 µW average power. Their production chip (taping out on GlobalFoundries’ 22FDX node) is expected to achieve 5–10 µW average, which is low enough to run a sensor for 15–20 years on a single CR2032 battery cell, or even operate battery-free using harvested RF energy.
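A quick sanity check in Python with nominal CR2032 numbers (roughly 225 mAh at 3 V, self-discharge ignored) shows that the 5 µW end of that range does indeed land in 15-to-20-year territory, while 10 µW is closer to a still-respectable decade.

# Coin-cell lifetime sanity check (nominal CR2032 capacity, no self-discharge).

CAPACITY_AH = 0.225                # ~225 mAh nominal
VOLTAGE_V   = 3.0
SECONDS_PER_YEAR = 365.25 * 24 * 3600

energy_j = CAPACITY_AH * 3600 * VOLTAGE_V        # ~2,430 J in the cell

for avg_power_w in (5e-6, 10e-6):
    years = energy_j / avg_power_w / SECONDS_PER_YEAR
    print(f"{avg_power_w * 1e6:.0f} µW average -> ~{years:.0f} years")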

Although the backscattering technique is awesome, some applications—think security sensors or event-driven alerts—need the device to speak first, not wait for a poll. For these devices, HaiLa has developed an extremely low-power active transmit mode. This still uses their custom PHY and protocol optimizations but generates a real RF signal, enabling asynchronous uplink for mere microwatts rather than milliwatts.

HaiLa’s production chip includes a RISC-V MCU, a hardware crypto accelerator (AES-128/256) for secure payloads, and flexible interfaces (SPI, I²C, analog). They’re also contributing heavily to the next IEEE 802.11 task group for Ambient IoT, which will eventually make these capabilities native to future Wi-Fi standards.
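As an aside, if you’re wondering what “secure payloads” means in practice, here’s a host-side Python sketch (using the third-party cryptography package) of AES-protecting a tiny sensor reading. The key handling, nonce scheme, and packet layout are all assumptions made purely for illustration; this is not HaiLa’s wire format.

import os
import struct

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Encrypt-and-authenticate a 4-byte sensor reading with AES-128-GCM.
key = AESGCM.generate_key(bit_length=128)   # in real life, provisioned securely
aesgcm = AESGCM(key)

reading = struct.pack("<Hh", 4521, -73)     # e.g. humidity*100, temp*10 (assumed)
nonce = os.urandom(12)                      # fresh 96-bit nonce per message

# Third argument is associated data (authenticated but not encrypted).
ciphertext = aesgcm.encrypt(nonce, reading, b"sensor-42")
print(f"{len(reading)}-byte reading -> {len(nonce) + len(ciphertext)}-byte payload")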

When radios no longer dominate the power budget, we will finally be in a position to deploy billions of sensors that run for decades—or forever—without battery replacement. Smart buildings, wearables, medical patches, smart cars, asset tags, and entirely new sensing categories will become feasible.

HaiLa isn’t just shrinking power consumption; they’re shrinking the cost of connecting the physical world. I think I’m ready for my implant now. How about you?
