What if it Happened Again?
We sit here in our dazed, progress-drunk technology buzz looking back at the half-century rocket ride that transformed not only our industry and engineering profession, but also all of modern civilization. Nothing in recorded history has had as much impact on the world as Moore's Law. It has reshaped global culture, dramatically altered politics, and even affected fundamental aspects of the ways human beings work, think, feel, and relate to each other. If it wasn't the single biggest change driver in the history of civilization, it was right up there with democracy, monotheism, combining caramel and chocolate, and some other really heavy hitters. Innovation in electronics has spilled over into just about every other aspect of our collective lives, and the change is profound.
But, what if it happened again - not in electronics this time, but somewhere else?
To answer that question, we should look at what caused Moore’s Law in the first place. It was a single innovation, really. Just one idea.
Are FPGAs Harbingers of a New Era?
The title may have put you off. In fact, it probably should have. After all, most of us in the press/analyst community have - at one time or another during the past decade or two - been walking around like idiots wearing sandwich signs saying, “The End is Nigh!” And, we got just about as much attention as we deserved. “Yawn, very interesting, press and analysts, and now back to planning the next process node…”
It gets worse. Predicting that Moore’s Law will end is pretty much a no-brainer. It’s about as controversial as predicting that a person will die… someday. There is obviously some point at which the laws of physics and the reality of economics will no longer allow us to double the amount of stuff we put on a single chip every two years. The question is - when will we reach that point, and how will we know we are there?
Lattice's iCE40 Ultra and Xilinx at X-fest
Bienvenido a Fish Fry! Welcome to this week’s field programmable Fish Frying festivities. First to join the podcastin' party is Joy Wrigley from Lattice Semiconductor. Joy and I discuss how FPGAs are breaking into the IoT scene and why low power will make all the difference in tomorrow’s mobile designs. Joining the fun next is Barrie Mullins from Xilinx. Barrie and I chat about how Vivado is playing a bigger role in this year's X-fest and why this conference isn't just for FPGA designers.
The Quest for Truth in Engineering
“Why don’t they just put solar cells on top of cars and power them that way?”
His tone implied that the engineers designing cars were just idiots, and he was sure he could do better - with just this one idea. I was going to answer with some helpful information about the amount of energy required to operate an automobile, the amount of energy collected by even idealized solar cells, and the amount of area available on top of a typical vehicle. I didn’t get the chance.
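The helpful information I had in mind amounts to a back-of-envelope calculation, which goes roughly like this (a sketch using rounded, assumed figures: about 1 kW/m² of peak sunlight, a couple of square meters of usable roof, and tens of kilowatts to keep a car moving at highway speed):

```python
# Back-of-envelope: can a roof's worth of solar cells power a car?
# All figures are rounded assumptions for illustration only.

SOLAR_IRRADIANCE_W_PER_M2 = 1000   # peak sunlight at the surface, ideal conditions
ROOF_AREA_M2 = 2.5                 # roughly the usable roof area of a sedan
CELL_EFFICIENCY = 1.0              # impossibly perfect cells, to be generous

HIGHWAY_POWER_W = 20_000           # rough power to cruise at highway speed

collected_w = SOLAR_IRRADIANCE_W_PER_M2 * ROOF_AREA_M2 * CELL_EFFICIENCY
shortfall = HIGHWAY_POWER_W / collected_w

print(f"Collected: {collected_w:.0f} W")   # 2500 W
print(f"Needed:    {HIGHWAY_POWER_W} W")   # 20000 W
print(f"Shortfall: {shortfall:.0f}x")      # 8x short, even with perfect cells
```

Even granting physically impossible 100%-efficient cells and ideal noon sunlight, the roof collects an order of magnitude less power than the car needs - which is the point I never got to make.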
His friend interrupted, “Well that’s just the government shutting them down. The oil companies have the government in their pocket, and they’re not about to let anyone develop technology like that. It’s the same with that 200 MPG carburetor that guy in Florida invented…”
Now, I was more hesitant to speak. I wanted to explain that modern, sensor-driven, computer-controlled fuel injection systems did a much better job achieving near-ideal fuel-air mixtures than any carburetor could ever hope to accomplish. I didn’t get the chance.
Cadence Rolls New Protium Platform
System on Chip (SoC) design today is an incredibly complicated collaborative endeavor. By applying the label "System" to the chips we design, we enter a realm of complex, interdisciplinary interactions spanning analog, digital, communications, semiconductor process, and - with increasing dominance - software. Since the first SoCs rolled out a mere decade or so ago, the composition of design teams has shifted notably, with the percentage of cubicles occupied by software developers increasing much more rapidly than those of any of the other engineering disciplines. In most SoC projects today, software development is the critical path, and the other components of the project are merely speed bumps in the software development spiral.
Lattice iCE40 Ultra Brings Programmability to Wearables
In the 1960s, an electronic device was “cool” if it had the word “transistor” in it. Even though the general public didn’t understand the benefits a transistor brought to a portable radio, everyone wanted the new “transistor” type. Then, of course, the shock and awe of Moore’s Law took the world on a fifty-year joy ride that completely isolated the electronics-buying public from any hope of understanding or appreciating what went on inside the latest consumer technology wonders.
For that reason, the “FPGA” label probably won’t be applied to mobile and wearable products the same way “transistor” was a few decades back, but the role of the FPGA today is no less transformative and enabling than the transistor was in the 1960s. Using an FPGA in a consumer device - particularly a small, power-sensitive, portable or mobile one - raises the stakes in a way that most definitely deserves a title role.
Thwarting Pirates with AI and X-fest 2014
At Attention Ye Salty Dogs! Hoist the mizzen mast, Fish Fry is ready to set sail! This week Jim Beneke comes aboard our mighty Fish Fryin’ ship to plot a course to X-fest - the yearly how-to training sessions from Avnet and Xilinx for FPGA, DSP, SoC and embedded systems designers. Join us as we dig into X-fest’s treasure trove of deep tech seminars, trainings, and much more. Then, in keeping with our pirateous parade, we delve into the details of a new pirate-busting radarrrr called WatchStander, and check out how artificial intelligence plays an important role in WatchStander’s modus operandi.
The Career-Limiting Blame Trap
John had been working for almost six weeks on a single part of the design persistence code. He had made no discernible progress. His original estimate for the project had been "2-3 days." As John's manager, and as the engineering manager responsible for the timely delivery of our project, I needed to do something. I stopped by John's office for a visit.
“They want us to use an OODB,” John stated flatly. “It won’t work.”
I was confused. I knew the project inside and out. Our team, ten software engineers including John, had worked out the plans together, in our own conference room, on our own white board. There had been no mention of an OODB at any point in that process. The planning documents we had jointly developed during those meetings - sketchy as they were - made no reference to any specific implementation details.
From Intelligent Design to Evolution in Engineering
I once had a colleague who defined a “system” as “the thing one level of complexity higher than you can understand.”
I always thought there was a bit of profound insight in that definition.
We humans have a finite capacity for understanding, and that capacity does not benefit from the bounty of Moore’s Law. Our brains don’t magically think twice as fast every two years, or have the capacity to consider exponentially more factors each decade. Our limited brains also don’t compensate well for the layer upon layer of encapsulated engineering accomplishments of our predecessors. Even comparatively simple examples of modern technology can be deceptively complex - perhaps more complex than a single human brain can fully comprehend.
Will FPGAs Take Over the Data Center?
At the Gigaom Structure 2014 event last week, Intel's Diane Bryant announced that Intel is "integrating [Intel's] industry-leading Xeon processor with a coherent FPGA in a single package, socket compatible to [the] standard Xeon E5 processor offerings." Bryant continued, saying that the FPGA will provide Intel customers "a programmable, high performance coherent acceleration capability to turbo-charge their algorithms," and that industry benchmarks indicate that FPGA-based accelerators can deliver >10x performance gains, with an Intel-claimed 2x additional performance thanks to a low-latency coherent interface between the FPGA and the processor.
If we did our math right, Intel is implying that an FPGA could boost the speed of a server-based application by somewhere in the range of 20x.
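That figure simply composes Intel's two claims multiplicatively - the generous reading, which assumes the gains stack:

```python
# Composing Intel's two claimed gains, assuming they multiply.
fpga_accelerator_gain = 10  # ">10x" from FPGA-based acceleration (industry benchmarks)
coherent_link_gain = 2      # Intel's claimed 2x from the low-latency coherent interface

combined_gain = fpga_accelerator_gain * coherent_link_gain
print(combined_gain)  # 20
```

Whether real workloads see anything close to that compounding is, of course, the open question.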