Sep 03, 2015

What’s in an IoT Name? And Who Goes First?

posted by Bryon Moyer

At Imec’s ITF a couple months ago, the Internet of Things (IoT) loomed large, as it has a tendency to do everywhere. Seems to be the great unifying force, the collective raison d’être for us all.

But they had a different spin: the “intuitive” internet of things. It seems that everyone is trying to carve out their own version of the IoT, which is easily done with a concept that vague.

Imec also discussed public projects like the environmental sampling prototype in place in Eindhoven. And it occurred to me that this might be a back door into the consumer IoT space.

Let’s take those in order.

How to define the IoT? We recently saw the internet of moving things (I wasn’t buying it). Then there’s Cisco’s “internet of everything.” I’ve also had a hard time with “everything” being better than simply “things” because, obviously, the IoT will never include everything, so it feels like taking all the IoT hype (of which there is plenty already) and then dialing it to 11.

So what is this “intuitive” IoT concept? Imec’s idea here is that the Things themselves melt into the background. In the end, what we experience isn’t a gadget, but rather a service enabled by a gadget or a combination of gadgets.

One example from Imec’s Harmke de Groot is a smart kitchen, where recipes are suggested based on available ingredients, an in-hood sniffer can make suggestions on work in progress, and perishables can be monitored for freshness. These would be enabled by a host of sensors in refrigerators, cabinets, and on and around the stove itself.

This reminds me of past presentations on context awareness (a phrase that seems to have faded out of the headlines lately), where interfaces fade away and machines anticipate our needs. Nice idea in principle, although fraught with complication. Then again, we’ve solved complicated problems before. It certainly seems that the focus on what IoT devices enable is the right focus. Just not sure it needs a new name (especially IIoT, which looks like the Industrial IoT).

The Eindhoven environmental sampling project was held out as another example: as far as residents are concerned, there’s a service that tells them the local air quality. The fact that this service is brought to you by a bunch of devices hidden away somewhere is secondary. But this particular application also got me thinking about my usual background question about where the Consumer IoT profits will come from.

The essence of that question is that there is a ton of investment going into IoT technology – and presumably someone is expecting a return on that investment (RoI). In the consumer space, where will that come from? Are there compelling services that will make consumers either spend more to save somewhere else or simply spend more? Because if they don’t spend more than they do without the IoT, then it’s hard to see where the RoI is.

But a project like the environmental monitoring one bridges industrial-looking and consumer-looking IoT. It has the scale of an industrial installation, but the beneficiaries of the services are consumers. Let’s call it the Municipal IoT. Things like street parking, street lighting, air quality. Municipalities are, to some degree, being forced into those applications. Just as with the Industrial IoT, there are efficiencies to be gained. Growth can be managed better, costs can be reduced, and quality of service can be improved. Unlike the Industrial IoT, however, this will be very visible to consumers.

So it got me wondering: will we ease into the Consumer IoT via the Municipal IoT, where exigencies force implementation without consumers having to buy in? This might defray a chunk of the fundamental technology investment, reducing the residual RoI needed from consumers via their in-home gadgets. It also gets consumers used to engaging with the services that the IoT enables, potentially stimulating demand for further, more personalized in-home services.

Speculation on my part. I think that segmenting the IoT in such a manner can have value for focusing investment for best return. Renaming the IoT, on the other hand, seems less beneficial.

Sep 01, 2015

Blue-Collar Sensors from Microchip

posted by Bryon Moyer

In our coverage of sensors, we’ve seen increasing levels of abstraction as microcontrollers in or near the sensors handle the hard labor of extracting high-level information from low-level signals. These are the hipster sensors that go on the wearables that go on your person for a month and then go on your nightstand.

Today, however, we’re going to get grittier and more obscure. Some sensors have more of a blue-collar feel to them, and I discussed two examples with Microchip back at Sensors Expo.

The first is a current sensor. Specifically, a “high-side” current sensor, meaning it goes in series with the upper power supply rail (not the ground rail). It can report current, voltage, or power. The unusual thing about this unit (the PAC1921) is that it provides both analog and digital outputs. “Why?” you may ask…

So much has moved to digital because, well, data can be provided in an orderly fashion, queried as needed by inquiring processors. FIFOs and advanced processing are available in the digital realm, and if you’re maintaining a history of power supply performance, digital is a great way to keep that tally.

Digital does, however, introduce latency. If you’re sensing the current and using the result in your power management algorithm, a bit of latency means that… oh, say, the voltage gets too high and you measure that and then digitize it and then put it someplace for a processor to find and then – oh, now look at that mess! Analog works much more quickly in a control loop. So here you get both.
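To put rough numbers on that trade-off, here’s a back-of-envelope sketch. Every timing below is an illustrative assumption of mine, not a PAC1921 datasheet figure; the point is only that the digital path stacks up several delays the analog path doesn’t have.

```python
# Back-of-envelope comparison of the two readout paths of a current
# sensor with both analog and digital outputs. All timings here are
# illustrative assumptions, not PAC1921 datasheet specs.

def digital_path_latency_us(adc_conversion_us=500.0,
                            bus_read_us=100.0,
                            poll_interval_us=1000.0):
    """Worst-case delay from an overcurrent event to the processor seeing it:
    the ADC must finish a conversion, the result must be read over the bus,
    and the processor may only poll once per control-loop iteration."""
    return adc_conversion_us + bus_read_us + poll_interval_us

def analog_path_latency_us(settling_us=5.0):
    """The analog output only waits on amplifier settling."""
    return settling_us

print(digital_path_latency_us())  # 1600.0 us before the loop can react
print(analog_path_latency_us())   # 5.0 us
```

Even with generous assumptions for the digital side, the analog output wins by a couple orders of magnitude in a tight control loop, which is exactly why having both on one part is handy.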

[Figure: current sensor]

(Image courtesy Microchip)

Then, off to a completely different unit: a temperature sensor. Well, actually, not the sensor itself, but the wherewithal to calculate temperature from a thermocouple.

Apparently our penchant for integration and abstraction has lagged in this corner of the world. While thermocouples can generate a voltage based on the temperature, calculating the precise temperature based on that voltage has been a discrete affair (not to be confused with a discreet affair). It requires lots of analog circuitry to measure the microvolt signal (typically done at a “cold” junction, away from the actual heat), digitize it, and then perform the math.

That math reflects the fact that thermocouples have a non-linear relationship between their output voltage and the temperature. And the details vary by thermocouple type. So this calculation is typically done in an external microcontroller.
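As a sketch of what that math looks like, here’s the general shape of the calculation the MCP9600 pulls on-chip: cold-junction compensation followed by an inverse polynomial from voltage to temperature. The sensitivity figure and the polynomial below are simplified, illustrative stand-ins; real designs use the per-type NIST ITS-90 coefficient tables, which run up to ninth order and vary by temperature range.

```python
# Sketch of thermocouple voltage-to-temperature conversion. The nominal
# sensitivity and the polynomial are simplified illustrations, NOT the
# real NIST ITS-90 coefficients for any thermocouple type.

NOMINAL_UV_PER_C = 41.0  # rough type-K-like sensitivity, illustrative

def thermocouple_temp_c(v_uv, cold_junction_c):
    """Approximate hot-junction temperature in Celsius.

    v_uv: thermocouple voltage in microvolts, measured at the cold junction
    cold_junction_c: temperature of the cold junction itself
    """
    # Cold-junction compensation: add back the EMF that the cold junction
    # effectively subtracts from the measurement.
    v_total = v_uv + cold_junction_c * NOMINAL_UV_PER_C
    # Illustrative inverse polynomial: a linear term plus a small quadratic
    # correction standing in for the thermocouple's non-linearity.
    return v_total / NOMINAL_UV_PER_C - 1.0e-8 * v_total**2
```

Sanity check: with zero measured voltage and a 25 °C cold junction, the function returns roughly the cold-junction temperature, as it should.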

This would make the new MCP9600 the first device fully integrated with all the bits needed to convert volts (from the thermocouple) into degrees Celsius. They refer to it as a thermocouple-conditioning IC, and it works for a wide range of thermocouple types (K, J, T, N, S, E, B and R for those of you keeping score).

[Figure: thermocouple conditioning]

(Image courtesy Microchip)

You can find more in their respective announcements: current sensor here, thermocouple here. We now return you to your white-collar sensors, which appear to have moved on from latte to white wine…

Aug 28, 2015

Fear and Trepidation at Intel

posted by Bryon Moyer

You may be aware that Intel went through a layoff recently. Whatever you think of the merits of the layoff itself, it would be hard to argue that it was executed smoothly. But, to hear some tell it, the result has left something of a crater in the morale and confidence of at least some of the surviving workforce. And this is a group that has seen many layoffs in the past. So what was different here? Why has there been such unusual attention and even discussion of lawsuits?

I was able to get some idea of what happened and how it happened via input intended to be anonymous. For those of you wondering what all the fuss is about, it gives some color to what’s going on. To be sure, this is a one-sided story: Intel has steadfastly declined comment (well, except the CEO – more on that in a minute). I offered up a “fact check” of some of the critical points that follow; it was respectfully declined.

So, with that caveat, let’s start with the obvious: layoffs suck. They suck for everyone around them. I’ve personally experienced the whoosh of a near-miss as well as the direct blast myself. Whether it was me or the guy next to me, neither of us enjoyed it.

Silicon Valley has historically grown by leaps and layoffs. Humans are expensive assets, so at the slightest sign of a business sniffle, it can be tempting to offboard some of this burden. After the 2008 meltdown, it almost felt like some companies that were doing well still had to lay people off just so that they didn’t look to the shareholders like they weren’t minding the store. “Everyone else is laying off, so we need to also. The remaining people can just work harder.”

So layoffs are a well-established part of Silicon Valley culture; nobody likes them, but they happen, and we know that. There are also rules when it comes to layoffs: above a certain size, you have to make them public, and you need to give the people you’re laying off a certain amount of notice. Neither happened as prescribed: the layoff was made public by a leak, and Intel notified folks only 30 days ahead, compensating with 2 months’ extra severance (which the layees-off didn’t have to sign anything to get).

But that’s just iffy execution; that’s not the main problem. The first, and biggest, main problem is the sense that the rules got changed for convenience (with suspicion of ageism – more on that in a mo). Nothing will rattle a workforce like finding out that the rules have changed – especially when it comes to compensation and employment.

To understand that, we need to dig into how compensation happens and how the layoff was implemented. Intel has a review process (“focal”) like any other company. Until recently, there were five categories: Outstanding, Exceeds expectations, Satisfactory, Below expectations, and Improvement required. This year, the “Below expectations” category disappeared: if you weren’t Satisfactory or better, then you were at the lowest level. A “low performer” is defined in their handbook as someone who got “below expectations” or “improvement required” 2 times out of the past 3 years.

Compensation apparently isn’t strictly tied to the review level, but obviously, consistency between review and compensation makes for a single, clear message. A good review with a bad reward (or vice versa) sends a confusing message. On the other hand, as any manager knows, employees at the top of their pay range and limited budgets can make it hard to put the money where the mouth is. It helps to have multiple tools for compensation.

Intel does have multiple tools. There are three components to the review cycle: the pay raise, a bonus target, and restricted shares. Options are given only to higher-level management; others get shares outright (the restrictions have to do with vesting and such).

And here’s where it gets complicated. Older employees are likely to be closer to the top of their pay range (simply by virtue of having gone through more review cycles). In addition, employees closer to retirement are less likely to benefit from long-term growth of stock. They’re getting into cash income territory – just like rebalancing portfolios from growth to income-earning investments.

So managers could, as a way of managing their budgets and allocating rewards, give their older employees more bonus cash and less in the way of stock. Younger employees might get the reverse. It wasn’t an official policy; it was at the discretion of each manager.

The point here is that employees were led to expect that their measured reward contained three components, not just one.

So that’s how compensation is (or was) expected to work. Then came the layoff, and the criteria for being laid off were three:

  • Current or repeat low performer
  • Got an “improvement required” during the past year
  • Were low on their stock grants

Notice that last one: the stock allocation was used as a proxy for the entire review. In particular, folks that got bonus instead of stock weren’t recognized for the bonus and were categorized as poor performers. This is where the “changing the rules” sentiment comes from. Essentially, a three-legged stool had two of the legs cut off. Some managers were apparently able to argue on behalf of well-performing employees that had fallen afoul of the stock thing, but those were the exception, not the rule.

So the first problem here is a perception that the rules changed. But there’s a second, more subtle issue. Because there was a tendency to bias older worker compensation against stock and towards bonus, there is the sense that this was done to bias the layoff against older workers. This is part of the rattling of legal swords.

Which brings us to another rule change. In a June 18th informational session, three days after the layoff was announced, the people being laid off were told that, after a two-day cooling off period, they could apply as contractors. Normally the wait was 12 months, so this felt like a significant concession.

But when some folks tried to apply as contractors, they got pushback. Upon digging, it appears to be a nuanced thing. Intel cannot tell an outside consulting agency whom they can and cannot hire or place, but the business units can decide whom they want to accept, and they’ve barred those affected by this layoff. This hasn’t been widely communicated; only those who tried to apply as contractors are likely to be aware. So its contribution to any malaise would have been through rumor and internal blogs.

So part 1 of why there is concern in the remaining workforce is the uncertainty of changing rules.

Part 2 was the fact that Intel at first tried to deny publicly that this was happening. That is, until the relevant memos got leaked to the press. That made it hard to keep things under wraps.

Which led to part 3: the CEO saying publicly that this was all about meritocracy. In so many words, he publicly announced that all of the laid-off engineers were bad employees and got what they deserved. This is not to say that all of those laid off were exemplary; there presumably were low performers in there. But the executive comments felt to me, as an outsider, like him pushing faces into the mud.

So, in review:

  • Layoffs suck. Always.
  • It’s worse if you’ve been playing by certain rules, and then those rules are changed to your disadvantage with no opportunity for appeal. That hurts credibility long-term.
  • It’s worse yet if your company publicly denies it’s doing what it’s doing. Another credibility hit.
  • And it’s worse yet when, forced to admit what’s happening, the CEO publicly denounces the people just laid off.

That, to my understanding, is why this layoff has caused so much commotion. I don’t claim to speak for all Intel employees, so feel free to comment below if you feel otherwise. (Or even if you agree…)
