feature article

Deep and Wide

The Engineering Tide

We engineers are unusually comfortable with periodicity.  We find ourselves fooling around in the frequency domain from the first days of our undergraduate education, and by the time we become practicing professionals, we whip in and out of Fourier’s follies with the facility of wild monkeys traversing the forest canopy.  We eat, drink, and breathe periodic waveforms.  We handle harmonics, passbands, s-planes, and corners with reckless abandon.  We own the spectrum.

When it comes to our own careers, however, some of us switch to DC psychology almost immediately. We paradoxically refuse to acknowledge that technologies, markets, companies, and the economy all exhibit complex periodic behaviors that affect our jobs, our areas of expertise, and our successes and failures.  If we applied our understanding of our craft to our career, we might save ourselves scads of sleepless nights, angry rants at “the man,” and hopeless plunges into the abyss of romanticized obsolescence.  

Take, for example, the adoption cycles of new technologies within our own tool bag.  A few years ago, FPGAs came along, and a small segment of the engineering population cheered (quietly).  The new technology had advantages so compelling for their applications that they were willing to dive in at the deep end.  These engineers immersed themselves in the LUTs, bitstreams, configuration logic, and primitive tools that were part of that primordial programmable logic landscape and came out as the “early experts.”  These EEs knew the ways of the FPGA world, and they carried that knowledge to new career heights as their perceived professional value paralleled the marketability of their newfound expertise.  (CLOSED CAPTIONING FOR THE CAREER IMPAIRED:  THIS IS WHERE THE PROBLEM BEGINS FOR MANY ENGINEERS.  WE WORK HARD AT SOMETHING NEW, ENJOY A SUCCESS, THEN EMOTIONALLY CONNECT OURSELVES TO THAT TECHNOLOGY.)

Unfortunately, our engineering self-esteem attaches itself to the wrong thing.  We believe we are valuable because of a particular transient technology, like “FPGAs,” not because we are great problem solvers who are quick to recognize a key technology and adept at learning the new skills required to harness it for truth, justice, and the betterment of all mankind.  (Oh, sorry, got carried away there. Let’s put those Lycra super-engineer tights back into the duffle bag for the time being.)

Why is this a problem?  In a word: periodicity.

As the new technology matures and is more widely adopted, tools and methodologies improve.  Gradually, our super-exclusive sandbox becomes populated with the B, C, and even D students.  We catch our teenagers using FPGA boards for science fair projects.  Wal-Mart starts stocking development kits.  Our hard-earned, deep-technology expertise becomes diluted, and we have to work to distinguish ourselves from the dime-a-dozen pretenders who flock to fill our former shoes.  Can these kids do the kind of design that we’re capable of?  Of course not.  They scrape together some VHDL fragments they found online into some sort of Franken-design, press the “GO” button on their free tool suite, and squeal with delight when their dev board blinks some LEDs on cue.  Never mind that they used six times the required amount of logic and that their design wouldn’t operate at more than 20% of the performance the technology could handle.  They blast on in blissful ignorance while we laugh up our sleeves.

The next phase is even worse.  As our beloved technology permeates the masses, its reach broadens, and new, unfamiliar design domains encroach.  For FPGA-o-philes, it used to be enough to know your favorite HDL inside-out and to have the actual behavior of all 753 runtime options of your synthesis tool (including 39 undocumented ones) committed to memory.  Now, suddenly, people want to drop a processor core on an FPGA, boot an operating system, and debug an embedded application – all running on top of a hypervisor.  This is like allowing polo ponies in a football game.  The old rules are out, and the breadth of expertise required to operate in the new reality is mind-boggling.  We “deep” engineers give way to the “wide” ones.  Our employers welcome the crew that couldn’t care less about optimal LUT utilization, but can boot Linux on a MicroBlaze using memory management while setting up partial reconfiguration of an SDR modem – making sure the application software can connect with the hardware acceleration bits running in FPGA fabric.  These messy, “Mad Max” masters don’t exercise the engineering discipline of the purist.  They Rube Goldberg some apples, oranges, rusty nails, baling wire, and duct tape into something useful in less time than a traditional “deep” engineer requires to prepare for the first formal specification review.

The old-school purist who led the charge into the new technology has two options at this point:  Fade away into self-pity and obscurity, or mount a new offensive on the next turning of the tide.  It is a rare individual who has the personality to reincarnate himself or herself as both “deep” and “wide” during a single career.  Our psyches just don’t seem to be wired that way.  

For engineering managers, recognizing the relevant skills and capabilities of both “deep” and “wide” engineers and constructing a team that plays to the strengths of both can be a huge advantage.  Teams that treat engineering talent as plug-and-play, or “hot swappable,” leave far more resources on the table than even the clueless neophytes casually dipping their toes in the FPGA technology pond.  Huge productivity gains are available when management recognizes, cultivates, and utilizes the individual strengths and talents of each engineering personality type, and when those engineers themselves recognize the true value of their problem-solving expertise rather than hitching their wagons to transient technologies that will be washed away in the sands of commodity during the next turning of the engineering tide.

An engineering degree is, above all else, a license to learn. Our education and experience infuse us with the skills to understand difficult problems and to develop tools and methodologies that will allow others to manage those solutions in a scalable manner – while we move on to new, unexplored territory.  As long as we remember this, our professional futures and happiness are all but assured.
