
Which Wireless Will Win?

The Internet of Things is the Driver

Imagine if our phones and gadgets all had wires for communication. Yeah… almost kills their usefulness. Now imagine a future sensor-saturated world, with all of them communicating by wire. Yup… we’d pretty much be crawling through a cat’s-cradle tangle to get anywhere.

We think of wireless as convenient, and it is. Why, with a wireless mouse and keyboard, the rat’s nest behind my desk has shrunk. Then came a wireless printer connection. And, somewhere along the way, wireless USB was supposed to happen, although those wires still seem to predominate. Add wireless display transfers sometime in the future, and pretty much the only cables necessary would be for power. Pretty sweet.

Looking further at the home network, power-line communication standards are one way of trying to get things talking to each other without a separate wire. But really, wireless appears to be the default approach – especially since many devices are designed for communication with a phone.

Once you move out of the home into the wide world of sensor networks, many of which may be in far-flung locales, then running wires stops being practical altogether.

Given that we’re moving into the age of wireless, the next obvious question is, which wireless? There are numerous standards and protocols, and, just a year or two ago, I was scratching my head trying to find some rhyme and reason to what was ending up where and why. A discernible pattern is emerging, however, and it’s my guess that commonality will precipitate out, with Bluetooth and WiFi dominating that precipitate. For a while.

WiFi, of course, has an entrenched position as part of the ubiquitous IP-based network. Replacing wired Ethernet, it continues to boost its ability to transport lots of data quickly. But that comes with a cost – largely in power consumption and complexity – and smart home devices have tended to use alternatives. Until recently, you’d see Zigbee, basic Bluetooth, and various proprietary formats predominating.

Those proprietary protocols serve the companies making the equipment by locking users into a system that limits their ability to buy from other vendors. Good for the equipment maker, but, given a choice, not likely to be preferred by users unless there’s a real differentiating advantage – like ease of use for the unsophisticated user.

But with the Internet of Things (IoT) blasting into the open and smart home gadgets forming an important part of that, improved performance and interchangeability will be important in order for equipment makers to compete. Users will have and want choices, and those choices will eventually, if they don’t already, provide features equal to or better than those provided by the proprietary folks. Just as AOL was successful early on but faded as its walled garden provided less value, so proprietary wireless will lose value over time.

Meanwhile, I’ve noticed that the word “Zigbee” seems less frequent in my press release inbox than it used to be. It’s still there, for instance, with Greenvity announcing IoT SoCs that support both HomePlug and Zigbee communications.

But as the Zigbee mentions have seemingly tailed off a bit, something else has been blossoming: Bluetooth. And, in particular, Bluetooth Low Energy (BT-LE or BLE).

Honestly? BT-LE seems to be everywhere. Its relationship to other BT standards can be a little confusing, since BT version 4.0 was dubbed “Bluetooth Smart,” and it’s an umbrella for different modes of BT. There’s the “standard” BR/EDR (Basic Rate/Enhanced Data Rate) version, which has a direct genetic link to the original BT that converted perfectly respectable-looking suits on the streets into crazy people talking to themselves. And there’s BT-LE. Both part of Bluetooth Smart.

While BR/EDR represents BT evolution, BT-LE is a different beast, created specifically to achieve lower power with a different use model. BT originally excelled at delivering point-to-point streaming data like audio; BT-LE, by contrast, is intended for bursty data transfers. And key to saving power in such a scheme is using the radio only when transmitting. Fast wake-up and shut-down are a part of that consideration, and BT-LE can hop out of bed and start work in about 3 ms, as compared to about 30 ms for BR/EDR and hundreds of ms for WiFi.
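To see how much those wake-up times matter, here’s a toy duty-cycle model – in Python, with made-up current numbers that aren’t from any datasheet – for a radio that wakes, sends a short burst, and goes back to sleep:

```python
# Toy duty-cycle model for a bursty radio. The current figures are
# illustrative assumptions, not taken from any datasheet.

def avg_current_ma(wake_ms, tx_ms, interval_ms, active_ma=10.0, sleep_ua=1.0):
    """Average current when the radio wakes, sends a burst, then sleeps.

    The wake-up and transmit phases draw active_ma (milliamps); the rest of
    the reporting interval draws sleep_ua (microamps).
    """
    active_ms = wake_ms + tx_ms
    sleep_ms = interval_ms - active_ms
    return (active_ms * active_ma + sleep_ms * (sleep_ua / 1000.0)) / interval_ms

# One short report per second: a 3 ms wake-up vs. a 30 ms one.
print(avg_current_ma(wake_ms=3.0, tx_ms=2.0, interval_ms=1000.0))   # ~0.051 mA
print(avg_current_ma(wake_ms=30.0, tx_ms=2.0, interval_ms=1000.0))  # ~0.321 mA
```

With otherwise identical numbers, the slower wake-up alone costs over six times the average current – which is why fast wake-up is a headline feature for BT-LE.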

In fact, because the two main BT modes are for two very different purposes, dual-mode chips have been designed that allow both channels to be used at the same time. These have already made their way into phones – something that Zigbee hasn’t achieved, making BT an easier choice for equipment makers that want to communicate with phones.

WiFi still has legs, although, as one company, Redpine, describes it, WiFi has value when you need a more robust, secure connection or when you want an easier time hopping onto the internet. It also has a longer range. BT-LE provides simpler, lower-power, local connectivity.

They’ve used this distinction to create a different dual-mode approach: they make real-time locating system (RTLS) tags that support both WiFi and BT-LE – and they can power them with photovoltaic (PV) cells. The challenge they faced was getting the WiFi part implemented with power low enough to sustain always-on operation while being fed only by the PV cells. It wasn’t so much the power management circuitry on the chip as it was the firmware and the wireless circuits themselves that provided the biggest challenge.
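To get a feel for the constraint they were up against, here’s an illustrative energy-budget check – every number is an assumption for the sketch, not a Redpine spec:

```python
# Illustrative energy-budget check for a PV-powered, always-on tag.
# Every number here is an assumption for the sketch, not a Redpine spec.

def pv_power_uw(area_cm2, irradiance_uw_per_cm2, efficiency):
    """Power harvested by a small photovoltaic cell, in microwatts."""
    return area_cm2 * irradiance_uw_per_cm2 * efficiency

# A 4 cm^2 cell under assumed indoor lighting (~100 uW/cm^2), 15% efficient:
harvested = pv_power_uw(area_cm2=4.0, irradiance_uw_per_cm2=100.0, efficiency=0.15)

# The design only closes if the tag's average draw stays under what's harvested.
radio_budget_uw = 50.0  # assumed average draw for the always-on radio + firmware
print(harvested, harvested >= radio_budget_uw)
```

At tens of microwatts of harvest, the whole radio-plus-firmware stack has to average well below what a conventional WiFi implementation draws – hence the challenge.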

These tags form a part of their proprietary location-based services (LBS) architecture. In particular, they use BT-LE for determining location within 1 m under harsh conditions (they say that others can achieve that precision only under favorable conditions). They use the signal arrival delay and angle of arrival to establish the position.
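Their actual algorithm isn’t public, but as a toy illustration of how angle of arrival alone can fix a position, here’s a sketch that intersects two bearing lines measured at anchors with known coordinates:

```python
import math

# Toy angle-of-arrival fix: intersect two bearing lines measured at anchors
# with known coordinates. An illustration only, not Redpine's algorithm.

def locate_from_aoa(a1, theta1, a2, theta2):
    """Return the (x, y) intersection of bearings theta1 and theta2
    (radians, from the +x axis) measured at anchor points a1 and a2."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve a1 + t1*d1 = a2 + t2*d2 for t1 by Cramer's rule.
    rx, ry = a2[0] - a1[0], a2[1] - a1[1]
    det = d1[0] * (-d2[1]) + d2[0] * d1[1]
    t1 = (rx * (-d2[1]) + d2[0] * ry) / det
    return (a1[0] + t1 * d1[0], a1[1] + t1 * d1[1])

# Anchors at (0, 0) and (10, 0); bearings of 45 and 135 degrees
# meet at roughly (5, 5).
print(locate_from_aoa((0.0, 0.0), math.pi / 4, (10.0, 0.0), 3 * math.pi / 4))
```

Add the arrival-delay measurement the article mentions and you get an independent range estimate, which is what lets a real system hold its accuracy when one measurement gets noisy.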

Now… despite this sense that BT-LE and WiFi have things sewn up, there still appears to be demand for alternatives. That’s because, between the two, you have close-in and somewhat distant covered, but you don’t have really distant covered well (something Zigbee does), and power is always an issue. So there are people working on new approaches (both wireless and wired).

Flutter is one of those new approaches. It was created by a robotics engineer who found Zigbee too hard for small-time designers to use and integrate. And he didn’t need super high speed: that’s very typical of sensor networks – you’re going to transmit periodic data, and not a lot of it. So enormous bandwidth isn’t required, which opens the door to lower power.

And for more remote installations, distance is a consideration. The Flutter folks have managed a kilometer, helped by their lower speed and frequency: 1.2 Mb/s transmitted at 915 MHz. They’re evaluating a new radio chip that they think can extend their range beyond 1 km.
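A quick bit of arithmetic shows why that modest speed is plenty for bursty sensor data (the 32-byte payload is an assumption for illustration):

```python
# Time on air for a sensor report at Flutter's quoted 1.2 Mb/s raw rate.
# The 32-byte payload is an assumption for illustration; framing overhead
# is ignored.

def airtime_ms(payload_bytes, rate_bps):
    """Milliseconds needed to send a payload at a given raw bit rate."""
    return payload_bytes * 8 / rate_bps * 1000

t = airtime_ms(32, 1.2e6)
print(f"{t:.3f} ms")  # ~0.213 ms per report
```

At one report a minute, the radio is idle well over 99.99% of the time – so the raw rate isn’t the bottleneck; power and range are.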

Flutter is currently an Arduino board project that’s looking for Kickstarter money to fill out the software protocol. It will be open-sourced for both hardware and software.

And there’s one more idea floating out there that’s even more radical. But that’s a story for another day.

So even though Bluetooth and WiFi seem to be surviving the shakeout, they may not meet all the needs that are out there, and new approaches are still being proffered. Looks like we may have a few more generations of thrash before things settle down. (If they ever do.)


8 thoughts on “Which Wireless Will Win?”

  1. Bryon,
    Let’s reason it out.
    When Betamax and VHS slugged it out, VHS won. It wasn’t as technically elegant, but nobody wanted to sign up to Sony’s agenda, so Betamax died.
    In the WiMAX vs LTE fight, LTE won despite the lower bandwidth cost of WiMAX. LTE was simply too widespread and was thus too convenient.
    There was also a fight between USB and 1394 many moons ago, when I still had a full head of hair. USB, despite its I/O and protocol difficulties, seemed to be simpler than IEEE 1394 and thus less costly in the long term. So it won.
    What I’m getting at is this:
    The best horse in the race rarely wins in high tech. It’s the cheapest, most convenient horse that isn’t owned by one company.
    Which of the wireless protocols that could interact with a vast network of sensors fits that profile the best?
