IoT Standards

Is the Time Right to Calm the Chaos?

by Bryon Moyer

OK, so, the Internet of Things (IoT) is hot and heavy, and everyone and their cousins are putting “IoT” on their press releases so that editors will read them. And some of those folks are actually developing technology for the IoT.

But even within the sphere of legitimate development, there are notes of caution: there is a whiff of chaos in the system that’s a bit too strong for some folks. They want standards. To be sure, that’s not everyone. In particular, some companies enjoy having proprietary technology that locks customers in, keeping them from using anything from any other company. And a big motivation for standards is to ensure interoperability, which kills that locked-in thing. So not everyone likes standards.


On The Hunt: Part One

HLS and Sub-atomic Particle Jitter

by Amelia Dalton

Dateline: The 5th of September. Time: 2100 hours. We're on the hunt. No, we’re not hunting the mysterious Yeti, the Loch Ness monster, or, heck, even the ever-elusive EUV. This time, we're looking for some HLS. My guest this week is Mark Milligan from Calypto. Mark joins Fish Fry for the very first time to bring HLS into the light, into the world, and into the caring hands... of Google? Oh yes. Also this week, we delve into the deeply nerdy realm of sub-atomic particle jitter and investigate how the U.S. Department of Energy's Fermi National Accelerator Laboratory is hoping to solve an age-old existential question: How many dimensions do we really live in? (Spoiler alert: The space-time continuum may actually be a quantum system made up of countless tiny bits of information.)


¡Viva el Nodo de 28nm!

The Impact of the Longest Lived Semiconductor Node

by Bruce Kleinman, FSVadvisors

We’ve spilled a lot of ink on 14/16nm FinFET here at EE Journal. It is exciting stuff: bleeding-edge process technology and a fabulous new transistor structure; heck, it’s 3D without the glasses. There is no doubt that FinFET will be a difference maker in some high-profile products. As reality sets in, however, there is growing doubt with regard to the breadth of FinFET applicability. FinFET will make the transition from bleeding-edge to leading-edge, but it may not make the transition to mainstream anytime soon … perhaps never.

FinFETs certainly sound golden. Like most golden developments, there is a catch: FinFETs require more complex process technology than planar FETs, and, coupled with the mind-numbing 14/16nm geometries in play, things get really, really complicated. We are talking quite literally about the most complex undertaking in the history of mankind, all hyperbole checked at the gate. And while that has tremendous cachet amongst us Silicon Valley types, unfortunately, it makes for expensive chips.


The Overachieving Middle Child

MIPS I6400 Introduces 64-bit to the Midrange

by Jim Turley

“Bifurcate” is a word you don’t get to use very often. Yet it’s a familiar concept in our industry. Mobile operating systems have bifurcated into the choice between Android and iOS. On the desktop, it’s Windows or MacOS. Verizon or AT&T. Home Depot or Lowe’s. ARM or x86.

In all of these cases, the big pie chart is pretty much equally divided between two major players, with a thin sliver of “other.” In the desktop environment, the “other” slice of the pie includes Linux: it’s there, but it’s not really used by normal people and doesn’t really compete on the same footing as Windows or MacOS. The mobile OS market includes BlackBerry and Windows Phone, among others, but in the big banquet of life they’re relegated to the kids’ table.


There’s a Processor in My FPGA!

Hey, There’s LUT Fabric in my SoC!

by Kevin Morris

The idea of processors and FPGAs working together is exactly as old as the idea of FPGAs. Perhaps older, in fact, because even the prehistoric pre-FPGA PLDs often showed up on CPU boards - palling up with the hot processors of the day (which boasted 8 full bits of bone-crushing capability - at speeds of over a megahertz!). Of course, those programmable devices were mostly doing “glue logic” work - connecting up things that weren’t easy to connect otherwise.

Since those early days, processors and programmable logic have enjoyed a long and romantic partnership - spending long lazy days gazing lovingly into each other’s IO ports, exchanging data (and some control signals as well), and enriching each other’s lives through mutual cooperation. The partnership was never equal, though. Processors got all the glamour and recognition. Debutante CPUs would burst onto the red carpet with wider words and faster clocks, and they’d barely give a nod to their loyal FPGA companions who worked silently in the shadows, doing all the dirty work.


Optimization Moves Up a Level

Mentor’s RealTime Designer Rises to RTL

by Bryon Moyer

There are a lot of reasons why we can create so much circuitry on a single piece of silicon. Obvious ones include hard work developing processes that make it theoretically doable. But someone still has to do the design. So if I had to pick one word to describe why we can do this, it would be “abstraction.” And that’s all about the tools.

In fact, my first job out of college came courtesy of abstraction. Prior to that, using programmable logic involved figuring out the behavior you wanted, establishing (and re-establishing) Boolean equations that described the desired behavior, optimizing and minimizing those equations manually, and then figuring out which individual fuses needed to be blown in order to implement those equations. From that fuse map, a programmer (the hardware kind) could configure a device, which you could then use to figure out… that it’s not working quite like you wanted, allowing you to throw the one-time-programmable device away and try again.
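The old flow described above - behavior, to Boolean sum-of-products equations, to manually minimized equations, to a fuse map - can be sketched in a few lines of Python. This is a hypothetical illustration only: the merge step is a single pass of the adjacency rule at the heart of Quine-McCluskey-style minimization, and the fuse map assumes a toy AND-plane with one true line and one complement line per input, not any real device’s format.

```python
from itertools import product

def minterms(truth_fn, n):
    """Step 1: capture desired behavior as the input combinations
    (minterms) where the output should be 1."""
    return [bits for bits in product((0, 1), repeat=n) if truth_fn(*bits)]

def merge_once(terms):
    """Step 2: one pass of manual minimization. Two terms differing in
    exactly one bit combine into one term with that position marked '-'
    (don't care); terms that merged nowhere are kept as-is."""
    merged, used = [], set()
    for i in range(len(terms)):
        for j in range(i + 1, len(terms)):
            a, b = terms[i], terms[j]
            diff = [k for k in range(len(a)) if a[k] != b[k]]
            if len(diff) == 1:
                t = list(a)
                t[diff[0]] = '-'
                if tuple(t) not in merged:
                    merged.append(tuple(t))
                used.update((i, j))
    return merged + [terms[i] for i in range(len(terms)) if i not in used]

def fuse_row(term):
    """Step 3: turn one product term into a fuse-map row for a toy
    AND-plane (1 = fuse intact/connected). Bit 1 connects the input's
    true line, bit 0 its complement line, '-' connects neither, so the
    input drops out of the product term."""
    row = []
    for bit in term:
        row += {1: [1, 0], 0: [0, 1], '-': [0, 0]}[bit]
    return row
```

For a two-input OR, `minterms(lambda a, b: a or b, 2)` yields three minterms, and one merge pass collapses them to the two terms `('-', 1)` and `(1, '-')` - i.e., the minimized equation a + b - each of which maps to one fuse row. Real minimization iterates merge passes and then selects a prime-implicant cover, which is exactly the tedium the author describes doing by hand.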

