
Conspicuous Consumerism

CES Takes Over Vegas

The Consumer Electronics Show (CES) debuted in 1967 – shortly, it turns out, after Gordon Moore made the prognostication that has come to be known as “Moore’s Law.”  This week, in Las Vegas, CES is celebrating its 40th anniversary.  So how is the old show doing?  In those 40 years, the number of exhibitors has risen from 110 to over 2,700, and the show space has grown by over 11X.  The number of attendees has mushroomed to well over 100,000 (estimates are around 140,000 for this year’s show).  It proudly proclaims that it is the world’s largest consumer technology show.

Is this growth impressive?  Hardly.  If the show had performed as well as the companies that attend it – just keeping pace with Moore's Law – our estimate is that CES should have drawn somewhere over a billion attendees this year, and the exhibit space should have been well in excess of 5,000 square miles.  Would a 70-mile by 70-mile exhibit space filled with approximately four times the population of the United States be too much to expect?
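The extrapolation above is back-of-the-envelope arithmetic, and it can be sketched in a few lines.  The 1967 baseline figures below are hypothetical assumptions chosen only for illustration, and the doubling period is the classic two years:

```python
# Back-of-the-envelope: scale CES's 1967 figures by Moore's-Law growth.
# Baseline numbers are hypothetical assumptions, not historical data.

years = 2007 - 1967          # four decades of CES
doubling_period = 2          # years per doubling (classic Moore's-Law rate)
growth = 2 ** (years / doubling_period)   # ~1,048,576x over 40 years

base_attendees = 1_000       # hypothetical 1967 attendance
base_space_sqft = 150_000    # hypothetical 1967 exhibit space

attendees = base_attendees * growth
space_sq_miles = base_space_sqft * growth / 27_878_400  # sq ft per sq mile

print(f"growth factor: {growth:,.0f}x")
print(f"attendees: {attendees:,.0f}")
print(f"exhibit space: {space_sq_miles:,.0f} sq mi "
      f"(~{space_sq_miles ** 0.5:.0f} mi on a side)")
```

Twenty doublings is roughly a million-fold, so even a modest starting point lands comfortably past a billion attendees and a 70-mile-square show floor.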

If you’re attending CES this week, however, you might easily believe it was that big.  And ironically, if you compared my 1967 24” Zenith color TV set with my 2006 50” plasma, you might never guess that the latter benefits from more than a million times the transistors of the former.  Why is it that we in the electronics engineering field deliver a bona fide, measurable, million-fold improvement in our product over a four-decade span, and our end customer notices only that the picture is a little bigger and sharper, while the tradeshow industry squeezes out a paltry 11x growth over the same period and it feels a million times larger?

The answer is that these results are based on human perception, and we as engineers generally fail to account for perception when we’re creating our masterpieces.  Here at this celebration of electronics created for the benefit of the general public, a number of trends are in evidence.  One of these, however, is the incredible challenge and frequent failure of product engineering to fully account for the human experience.

Case in point:  I assert that, despite decades of technology advancement, today’s electronic products have actually gotten more difficult to use.  We design with the best of intentions, of course.  Rare is the consumer electronics product that fails to deliver an installation CD, a “Read me First” document, Macintosh and Windows instructions in 17 languages, an errata sheet, release notes, USB and FireWire cables, a power supply compatible with nearly every civilized country’s utility system, a detailed listing of all the imaginable ways the product could be used to injure oneself, others, or the environment, a protective plastic bag clearly marked with warnings about suffocating small children, a sticker with lessons on the hazards of static electricity, and an on/off switch marked with an international icon representing the concept of power application rather than those cryptic “ON” and “OFF” labels that carelessly assumed the consumer was literate.

With all these measures in place, how is it that we professional electronics engineers are so frequently called into service to deliver technical support to our parents – those often university-educated, usually intelligent, reasonable adults that brought us into this world and raised us to maturity – to solve complex issues like “How do I check the voicemail on my cell phone?”  These technological wonders, designed by us, that are integrated almost by necessity into the very fabric of our parents’ lives, are often more confusing to operate than their predecessors.  Usually, the culprit is softness or, more specifically, modality.

In the good old days, controls were “hard”.  Every function of a device generally had a corresponding button, switch, or knob, and that control always did the same thing.  Today, however, with an embedded computing system hiding behind every faceplate, we’ve introduced a level of abstraction that a large segment of the population has yet to absorb.  In our attempts to make our devices more intelligent – concealing their inner complexity – we may have accidentally lost many of our users by introducing a layer of unwelcome obscurity. 
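The modality problem can be shown with a toy sketch.  All the names and behaviors below are hypothetical, but the contrast is the point: a “hard” control always maps to one function, while a “soft” button’s meaning depends on hidden state the user must keep track of:

```python
# Toy illustration of "hard" vs. "soft" (modal) controls.
# All names and behaviors here are hypothetical.

# Hard control: one button, one function, always.
def volume_up(device):
    device["volume"] += 1

# Soft control: the same physical button means different things
# depending on which mode the device happens to be in.
def soft_button(device):
    mode = device["mode"]            # hidden state the user must track
    if mode == "playback":
        device["volume"] += 1
    elif mode == "menu":
        device["cursor"] += 1        # now it moves a menu cursor instead
    elif mode == "standby":
        device["mode"] = "playback"  # and here it wakes the device

device = {"mode": "menu", "volume": 5, "cursor": 0}
soft_button(device)   # the user expected louder audio...
print(device)         # ...but only the menu cursor moved
```

One button now has three meanings, and nothing on the faceplate tells the user which one is currently in force – which is exactly the confusion our parents call us about.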

The discrepancy between perception and technical reality doesn’t stop at the user interface level, either.  As I walked the CES floor yesterday, I saw “1080p” printed prominently on literally hundreds of booth signs.  The vendors demonstrating this technology tout it as the purists’ high-definition.  Most engineers could tell you the difference between the progressive 1080p and the interlaced 1080i without blinking an eye.  However, after walking around for hours watching split screens with 1080p on the right and standard definition or conventional DVD on the left, and after staring intently at the 720p and 1080p versions of Panasonic’s plasma monitors, I’m convinced that few people on the planet could walk into a room and reliably tell you whether the TV was showing 720p, 1080i, or 1080p.  Even though the latter is considerably more technically challenging to deliver, the experiential benefits for the consumer are tiny at best.
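The technical gap is easy to quantify even when the perceptual gap isn’t.  A rough sketch of the raw luma-sample rates involved, assuming 60 fields or frames per second (a common US rate), shows why 1080p is the harder signal to deliver:

```python
# Raw luma-sample rates for the HD formats discussed above,
# assuming 60 fields/frames per second (a common US rate).

def pixel_rate(width, height, fps, interlaced=False):
    """Samples per second; interlacing delivers half the lines per pass."""
    lines = height // 2 if interlaced else height
    return width * lines * fps

rate_720p  = pixel_rate(1280, 720, 60)                    # 55,296,000
rate_1080i = pixel_rate(1920, 1080, 60, interlaced=True)  # 62,208,000
rate_1080p = pixel_rate(1920, 1080, 60)                   # 124,416,000

print(f"1080p carries {rate_1080p / rate_1080i:.0f}x the samples of 1080i")
```

Double the raw samples of 1080i, with all the bandwidth, processing, and panel implications that follow – and yet, on the show floor, the eye can barely tell.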

Sure, I’ve seen the official demo.  With a frequency sweep or a moving solid object on a contrasting background, the motion artifacts of interlacing can actually be seen by a human eye – mine even.  However, even with a large amount of money on the table, I am not convinced that I could stare at, say, five screens playing identical programming and tell you which two were 1080p and which three were 1080i.  As an industry, our ability to measure and engineer often exceeds our customers’ ability to discern and enjoy.

On the embedded front, my initial scan through CES shows that embedded computing systems are being expected to deliver more and more of the features that were the purview of high-end desktop computers only a couple of years ago.  Multi-core and multi-processor embedded systems are now almost the rule rather than the exception in consumer products, and full-featured operating systems are permeating consumer applications.  More often than not, advanced hardware capabilities such as 3D graphics-rendering engines and high-bandwidth wireless communication are also part of the equation in delivering to consumers the converged embedded experience we’re busy developing.

While we integrate and converge all that capability, however, we should all keep our mothers in mind.  The smartest phones and the most intelligent devices won’t be useful if they can’t make the hardest connection of all – the connection with the flawed, perceptual humans who are trying to integrate them into their lives.  The devices and systems that can solve that problem are the ones that will be with us for years rather than weeks.
