X Marks the Spot

Thwarting Pirates with AI and X-fest 2014

by Amelia Dalton

At Attention Ye Salty Dogs! Hoist the mizzen mast, Fish Fry is ready to set sail! This week Jim Beneke comes aboard our mighty Fish Fryin’ ship to plot a course to X-fest - the yearly how-to training sessions from Avnet and Xilinx for FPGA, DSP, SoC and embedded systems designers. Join us as we dig into X-fest’s treasure trove of deep tech seminars, trainings, and much more. Then, in keeping with our pirateous parade, we delve into the details of a new pirate-busting radarrrr called WatchStander, and check out how artificial intelligence plays an important role in WatchStander’s modus operandi.

 

Black Helicopters

The Career-Limiting Blame Trap

by Kevin Morris

John had been working for almost six weeks on a single part of the design persistence code. He had made no discernable progress. His original estimate for the project had been “2-3 days.” As John’s manager, and as the engineering manager responsible for the timely delivery of our project, I needed to do something. I stopped by John’s office for a visit.

“They want us to use an OODB,” John stated flatly. “It won’t work.”

I was confused. I knew the project inside and out. Our team, ten software engineers including John, had worked out the plans together, in our own conference room, on our own white board. There had been no mention of an OODB at any point in that process. The planning documents we had jointly developed during those meetings - sketchy as they were - made no reference to any specific implementation details.

 

Combatting Complexity

From Intelligent Design to Evolution in Engineering

by Kevin Morris

I once had a colleague who defined a “system” as “the thing one level of complexity higher than you can understand.”

I always thought there was a bit of profound insight in that definition.

We humans have a finite capacity for understanding, and that capacity does not benefit from the bounty of Moore’s Law. Our brains don’t magically think twice as fast every two years, or have the capacity to consider exponentially more factors each decade. Our limited brains also don’t compensate well for the layer upon layer of encapsulated engineering accomplishments of our predecessors. Even comparatively simple examples of modern technology can be deceptively complex - perhaps more complex than a single human brain can fully comprehend.

 

When Intel Buys Altera

Will FPGAs Take Over the Data Center?

by Kevin Morris

At the Gigaom Structure 2014 event last week, Intel’s Diane Bryant announced that Intel is “integrating [Intel’s] industry-leading Xeon processor with a coherent FPGA in a single package, socket compatible to [the] standard Xeon E5 processor offerings.” Bryant continued, saying that the FPGA will provide Intel customers “a programmable, high performance coherent acceleration capability to turbo-charge their algorithms” and that industry benchmarks indicate that FPGA-based accelerators can deliver >10x performance gains, with an Intel-claimed 2x additional performance, thanks to a low-latency coherent interface between the FPGA and the processor.

If we did our math right, Intel is implying that an FPGA could boost the speed of a server-based application by somewhere in the range of 20x.
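Taking the quoted figures at face value, here is a minimal back-of-the-envelope sketch of that arithmetic (an illustration of the claim as stated, not a measured result):

    # Rough combined-speedup estimate implied by the quoted claims
    fpga_accelerator_gain = 10.0    # ">10x" from the industry benchmarks Bryant cited
    coherent_interface_gain = 2.0   # the additional 2x Intel attributes to the coherent link

    combined_gain = fpga_accelerator_gain * coherent_interface_gain
    print(f"Implied overall speedup: roughly {combined_gain:.0f}x")    # roughly 20x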

 

BFFs 4 FPGAs

Maxim Debuts PIXI Programmable Analog

by Kevin Morris

He raised his binoculars and scanned the surface of the water. The wind was gusting - causing irregular dark patches of waves to move mysteriously across the surface. Whitecaps broke here and there. “It was a day just like this when it appeared,” he recalled somberly. “It raised up right over there, and I watched it for a good twenty seconds as it moved left to right - like nuthin’ I ever saw in my life. Then, it dropped back outa’ sight and I ain’t seen it since.”

OK, maybe programmable analog isn’t quite as elusive as legendary lake monsters, but it might be close. We see the occasional press release or hear a buzz at a conference that “programmable analog has finally arrived!” - but then the idea drops back into the darkness and disappears.

It turns out there’s a good reason for this recurring disappearing act. Programmable analog is hard.

 

Is the Classic Design Chain Broken?

Or Is It Just Another Step in Evolution?

by Dick Selwood

It used to be so simple. A group of chip designers would sit around drinking coffee and gently mulling things over when one would say, "You know what would be really cool? If we add a backward splurge feature to the K11 widget, it would allow users to do some awesome things."

After a bit of engineering discussion, the sales team would go off and chat to a few friendly customers, come back and say, "They aren't against it." After this, management would buy into the project. When the device was launched, the marketing team would make a lot of noise about user input and then the company would sit back and wait for orders. Sometimes they came, sometimes they didn't.

 

Stuck in (Net) Neutral(ity)

by Bruce Kleinman, FSVadvisors

This article is NOT about net neutrality. I’m hoping that opening declaration serves as a bit of a moat, as from what I’ve seen in the past few weeks any point of view on net neutrality is met with serious vitriol. I don’t need contempt in general, especially over something that I do for free. So while I stand by my many requests for comments, head elsewhere if you’re going to flame me.

If you are reading this article, odds are extremely good that you are on the Internet (given that EE Journal does not offer fax or mail service). And if you are on the Internet, odds are extremely good you’ve been reading about the Netflix-Comcast feud, which - from my perspective, and I am not being sarcastic at this point - is NOT about net neutrality.

In the interest of full disclosure, I am doubly biased: I am both a Netflix customer AND a Comcast customer. I’m not certain of the Socratic logic, but I’m going with the notion that being biased on both sides of the feud cancels out and leaves me unbiased.

 

Mass to Maker

Engineering the Deindustrial Revolution

by Kevin Morris

The Industrial Revolution was all about scalability. By developing efficient, scalable processes for manufacturing goods, engineers were able to create products for the masses. Instead of treating each new automobile as a separate artisan project, assembly lines cranked out enormous numbers of identical cars with dramatically less (and less-skilled) manual labor. As a result of this more labor-efficient production, the cost of cars dropped, and automobiles became available to the masses. And so it went, from cars to candy to Converse, the flood of mass-produced commodities flowed across the landscape of civilization, forming rivers and lakes, carving gorges and valleys, and re-forming the very fabric of society.

The key element in the engineering of the Industrial Revolution was standardization, epitomized by Henry Ford’s “A customer can have a car painted any colour that he wants so long as it is black.” We collectively took advantage of identical products, interchangeable parts, and rigid standards to reduce the amount of skill and labor required for production. Engineering is the art of compromise, however, and the big compromise in this strategy was customization. Mass production was antithetical to individuality. Human beings devolved into an indiscernible sea of grey as this loss of customization and choice robbed us of our personal preferences in product design.

 

Collision of Two Worlds

The FPGA Supercomputing Nexus of Hardware and Software

by Kevin Morris

We are always trying to make machines that think faster. Before we finish building computers that can solve the last generation of problems, our imagination expands and we have a whole new set of challenges that require a new level of computing power. Emerging applications like machine vision can seemingly consume all of the computing power we could possibly throw at them - and then some.

For the past couple of decades, a quiet but radical minority has seen FPGAs as a magic bullet in the quest for more computing power. However, the challenges of programming FPGAs for software-like tasks were daunting, and the inertial progress of von Neumann machines surfing the seemingly-eternal wave of Moore’s Law was sufficient to keep our appetites sated.

However, the monolithic von Neumann machine ran out of steam a few years ago.

 

Dawn of a New Ara

What’s Google’s New Modular Smartphone Really About?

by Kevin Morris

It would be easy to blow off Google’s “Project Ara” modular smartphone concept as just another one of those Google science fair projects that will never come to anything. Remember “Google Wave,” anyone? Yeah, we were all “waving” bye-bye to that one before it ever got off the ground. As engineers, we all know that only one out of every dozen or more cool technology ideas ever comes to anything interesting. But Google’s propensity for over-funding lots of blue-sky projects just to see if anything sticks is well known, and that approach is a fertile breeding ground for highly-public failures.

The marketing for Ara doesn’t help much either. Ara is billed as the “smartphone for the next five billion people,” but that marketing concept just doesn’t hold water. The implication is that all those people out there who currently don’t have a smartphone (you know the ones) have been just waiting around patiently until somebody gave them a phone they could customize with various application processors, wireless modules, memories, screens, cameras, and accelerometers. Hmmm… Just take a quick poll of the folks you know who don’t yet have a smartphone. Think it’s because they really, really want the quad-core? Yeah, me neither.
