
What Would Spock Do?

Do the Needs of the Many Outweigh the Needs of the One?

You invent stuff. Your colleagues over there in the [fill in the blank] department invent stuff, too. Together, everyone in your company collaborates to invent stuff, which you then sell for money. Outside people seem willing to trade their money for your stuff. Everybody wins. 

That’s the third-grade-level, Little Golden Book version of market dynamics and capitalism. Most of us grok the concept pretty easily. But of course, the reality is never that simple.

What if you didn’t actually invent the stuff you’re selling? What if, say, you just dug it up out of the ground (i.e., you’re a miner)? Or found it lying on the street (a recycler)? Or stole it from somebody else (a thief or a fence)? Or, more germane to our industry, what if you saw an interesting new machine, liked how it worked, and then created your own version of it?

In the art world, that’s known derisively as a derivative work – something not wholly original. And yet, there are very definite “schools” of art that all look (to the untrained eye) like the same style, or even the same painter. To be honest, I can’t tell the difference between one velvet Elvis painting and another, although I’m sure the cognoscenti could explain the finer details. Architecture goes through stylistic phases; a colonial-style house from the late 1700s doesn’t look anything like a 1950s tract home. Are similar styles derivative or a sign of progress?

All of these questions and others went through my head as I watched a bad cyber-thriller movie. (Like, “has the director ever used a computer?”) Amid all the faux technology and made-up MacGuffins were a number of familiar technological tropes. For example, every object with a display screen, from cell phones to huge wall monitors, had a touch interface. Specifically, they all responded to the “pinch to zoom” gesture that’s so familiar today. And I wondered, who invented that gesture? Who decided that touching a screen with two fingers and then moving them apart (or closer together) should change the on-screen magnification? It’s a terrific innovation, in part because it’s completely intuitive. Once you’ve done it once (or seen someone else do it), it comes naturally. So much so that we sometimes “pinch” screens that don’t have touch interfaces. Like in the movie.
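
The mechanics behind the gesture are simple enough to sketch in a few lines. Here’s a minimal, hypothetical TypeScript handler built on the standard browser touch events; the “viewer” element and the scale bookkeeping are illustrative assumptions, not anyone’s actual implementation. The whole trick is a ratio: magnification scales with the current finger spacing divided by the spacing when the pinch began.

```typescript
// Pinch-to-zoom sketch using standard browser TouchEvents.
// Assumption: an element with id "viewer" is the zoom target.

let startDistance = 0; // finger spacing when the pinch began
let startScale = 1;    // magnification at that moment
let scale = 1;         // current magnification

// Distance between the first two active touch points.
function touchDistance(touches: TouchList): number {
  const dx = touches[0].clientX - touches[1].clientX;
  const dy = touches[0].clientY - touches[1].clientY;
  return Math.hypot(dx, dy);
}

const el = document.getElementById("viewer")!;

el.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) {
    startDistance = touchDistance(e.touches);
    startScale = scale;
  }
});

// passive: false lets us call preventDefault() to stop the page scrolling.
el.addEventListener("touchmove", (e: TouchEvent) => {
  if (e.touches.length === 2 && startDistance > 0) {
    e.preventDefault();
    // Fingers moving apart => ratio > 1 => zoom in; together => zoom out.
    scale = startScale * (touchDistance(e.touches) / startDistance);
    el.style.transform = `scale(${scale})`;
  }
}, { passive: false });
```

Part of why the gesture feels so natural: the ratio is dimensionless, so “spread to enlarge, pinch to shrink” behaves the same on a phone as on a wall-sized monitor.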

Is that guy rich now? Is he lounging on his own tropical island sipping umbrella drinks? Or is pinch-to-zoom one of those things that “just happened,” with no specific inventor? Augustin-Jean Fresnel, the engineer who invented lighthouse lenses, utterly changed maritime navigation, saved thousands of lives, and enriched shipping companies the world over, yet he received nothing for his efforts. Well, apart from having his name engraved on the side of the Eiffel Tower in two-foot-tall letters. That’s pretty cool.

But back to the pinch-to-zoom: what if that innovation had been patented, copyrighted, or trade-secreted in such a way that it couldn’t be used by anybody else? What if Apple, Samsung, HP, Nokia, and just about every other company in the world were locked out of that particular user-interface component? We’d have no pinch-to-zoom feature on any of our products.

Not a galactic-scale tragedy, to be sure, but a bummer nonetheless. That’s a handy feature, and I’m glad that it’s ubiquitous. Just like I’m glad that all cars have the same three pedals, steering wheel, and dashboard instruments. Or that QWERTY keyboards, not to mention piano keyboards, are standardized. Did the QWERTY guy get rich? Does some medieval musician deserve the credit for claviers, harpsichords, and pianos? Life would have a bit less sparkle if those all had different user interfaces.

But if those innovations had inventors, isn’t it up to the inventors to decide how they’re used – or not? Does inventing something (or finding it in the ground, lying on the street, etc.) grant you the privilege – nay, the right – to determine its ultimate use? If you find a diamond in the dust, you’re under no obligation to sell it, or even to tell anyone where you found it. If you discover a gold mine, you don’t have to share its riches (not in most countries, anyway). And if you code an efficient new network protocol stack, you’re under no obligation to use it, publish it, sell it, or open-source it. It’s yours, to do with as you please.

Yet there’s the contrary goal of the public good. If a semiconductor company is acquired by another company, which then kills off the former company’s products and deep-sixes its technology, it has arguably damaged the industry as a whole. At the very least, it has probably annoyed several hundred engineers who used to use that company’s products. What’s the ethical obligation to keep a product alive and available on the market? Not very much; we see old products killed off all the time. That’s natural product evolution, and it’s especially prevalent in our industry. But what if you kill off a product just to remove it from the marketplace? The old saying that you can’t have two bulls in the same pasture sometimes applies to tech products, where vendors will kill off one perfectly good product to protect sales of another. This is usually the aftereffect of a merger or an acquisition, but it can also come about naturally with enough poor planning.

So let’s say you’re in charge of your company’s “innovation monetization.” You control the patent portfolio, the trademarks, the intellectual property, and (not incidentally) the engineers’ compensation and overall morale. If someone in your group develops, discovers, or invents something useful, what do you do with it? In most firms, such innovations automatically become the property of the company, not the inventor(s), so this is a legitimate question. Somebody really does have to decide how, or whether, to commercialize a new development, whether or not to keep it under wraps, and whether or not to compensate the engineers who brought it to light. You also have the tricky problem of determining if it’s actually original, or if your engineers just happened to see a competitor’s demo at a trade show and decided to copy it. A “clean room” copy might be okay, or it might run afoul of IP laws in some far-off land. Ready, set, go.

And if you’re an independent inventor toiling away in a garage somewhere, do you necessarily commercialize your new gizmo, protocol stack, or GUI innovation? Or do you sell/license the technology to someone else – and, if so, how much do you charge for it? How do you balance the public good (if any) against your personal gain? Do the needs of the many outweigh the needs of the few… or the one? What would Spock do? 

