
What Would Spock Do?

Do the Needs of the Many Outweigh the Needs of the One?

You invent stuff. Your colleagues over there in the [fill in the blank] department invent stuff, too. Together, everyone in your company collaborates to invent stuff, which you then sell for money. Outside people seem willing to trade their money for your stuff. Everybody wins. 

That’s the third-grade-level, Little Golden Book version of market dynamics and capitalism. Most of us grok the concept pretty easily. But of course, the reality is never that simple.

What if you didn’t actually invent the stuff you’re selling? What if, say, you just dug it up out of the ground (i.e., you’re a miner)? Or found it lying on the street (a recycler)? Or stole it from somebody else (a thief or a fence)? Or, more germane to our industry, what if you saw an interesting new machine, liked how it worked, and then created your own version of it?

In the art world, that’s known derisively as a derivative work – something not wholly original. And yet, there are very definite “schools” of art that all look (to the untrained eye) like the same style, or even the same painter. To be honest, I can’t tell the difference between one velvet Elvis painting and another, although I’m sure the cognoscenti could explain the finer details. Architecture goes through stylistic phases; a colonial-style house from the late 1700s doesn’t look anything like a 1950s tract home. Are similar styles derivative or a sign of progress?

All of these questions and others went through my head as I watched a bad cyber-thriller movie. (Like, “has the director ever used a computer?”) Amid all the faux technology and made-up MacGuffins were a number of familiar technological tropes. For example, every object with a display screen, from cell phones to huge wall monitors, had a touch interface. Specifically, they all responded to the “pinch to zoom” gesture that’s so familiar today. And I wondered, who invented that gesture? Who decided that touching a screen with two fingers and then moving them apart (or closer together) should change the on-screen magnification? It’s a terrific innovation, in part because it’s completely intuitive. Once you’ve done it one time (or seen someone else do it), it comes naturally. So much so that we sometimes “pinch” screens that don’t have touch interfaces. Like in the movie.
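Purely as an aside for the engineers in the audience: the arithmetic behind the gesture is charmingly simple. Here’s a minimal sketch (in Python, using made-up touch coordinates rather than any particular platform’s touch API) of how a magnification factor can be derived from two finger positions.

```python
import math

def pinch_zoom_scale(start_points, current_points):
    """Return a magnification factor from two-finger touch positions.

    start_points / current_points are ((x1, y1), (x2, y2)) tuples: the
    finger positions when the gesture began and where they are now.
    The scale is the ratio of the finger separations, so fingers moving
    apart zoom in (> 1.0) and fingers pinching together zoom out (< 1.0).
    """
    def distance(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    start = distance(start_points)
    if start == 0:
        return 1.0  # both fingers at the same spot; no meaningful gesture
    return distance(current_points) / start

# Fingers start 100 px apart and spread to 250 px apart: zoom in 2.5x
print(pinch_zoom_scale(((100, 300), (200, 300)), ((50, 300), (300, 300))))
```

The hard part, of course, was never the arithmetic; it was deciding that this is what two fingers on a screen should mean.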

Is that guy rich now? Is he lounging on his own tropical island sipping umbrella drinks? Or is pinch-to-zoom one of those things that “just happened,” with no specific inventor? The engineer who invented lighthouse lenses (Augustin-Jean Fresnel) utterly changed maritime navigation, saved thousands of lives, and enriched shipping companies the world over, yet he received nothing for his efforts. Well, apart from having his name engraved on the side of the Eiffel Tower in 10-foot-tall letters. That’s pretty cool.

But back to the pinch-to-zoom: what if that innovation had been patented, copyrighted, or trade-secreted in such a way that it couldn’t be used by anybody else? What if Apple, Samsung, HP, Nokia, and just about every other company in the world were locked out of that particular user-interface component? We’d have no pinch-to-zoom feature on any of our products.

Not a galactic-scale tragedy, to be sure, but a bummer nonetheless. That’s a handy feature, and I’m glad that it’s ubiquitous. Just like I’m glad that all cars have the same three pedals, steering wheel, and dashboard instruments. Or that QWERTY keyboards, not to mention piano keyboards, are standardized. Did the QWERTY guy get rich? Does some medieval musician deserve the credit for claviers, harpsichords, and pianos? Life would have a bit less sparkle if those all had different user interfaces.

But if those innovations had inventors, isn’t it up to the inventors to decide how they’re used – or not? Does inventing something (or finding it in the ground, lying on the street, etc.) grant you the privilege – nay, the right – to determine its ultimate use? If you find a diamond in the dust, you’re under no obligation to sell it, or even to tell anyone where you found it. If you discover a gold mine, you don’t have to share its riches (not in most countries, anyway). And if you code an efficient new network protocol stack, you’re under no obligation to use it, publish it, sell it, or open-source it. It’s yours, to do with as you please.

Yet there’s the contrary goal of the public good. If a semiconductor company is acquired by another company, which then kills off the former company’s products and deep-sixes its technology, they’ve arguably damaged the industry as a whole. At the very least, they’ve probably annoyed several hundred engineers who used to use that company’s products. What’s the ethical obligation to keep a product alive and available on the market? Not very much; we see old products killed off all the time. That’s natural product evolution, and it’s especially prevalent in our industry. But what if you kill off a product just to remove it from the marketplace? The old saying that you can’t have two bulls in the same pasture sometimes applies to tech products, where vendors will kill off one perfectly good product to protect sales of another. This is usually the after-effect of a merger or an acquisition, but it can also come about naturally with enough poor planning.

So let’s say you’re in charge of your company’s “innovation monetization.” You control the patent portfolio, the trademarks, the intellectual property, and (not incidentally) the engineers’ compensation and overall morale. If someone in your group develops, discovers, or invents something useful, what do you do with it? In most firms, such innovations automatically become the property of the company, not the inventor(s), so this is a legitimate question. Somebody really does have to decide how, or whether, to commercialize a new development, whether or not to keep it under wraps, and whether or not to compensate the engineers who brought it to light. You also have the tricky problem of determining if it’s actually original, or if your engineers just happened to see a competitor’s demo at a trade show and decided to copy it. A “clean room” copy might be okay, or it might run afoul of IP laws in some far-off land. Ready, set, go.

And if you’re an independent inventor toiling away in a garage somewhere, do you necessarily commercialize your new gizmo, protocol stack, or GUI innovation? Or do you sell/license the technology to someone else – and, if so, how much do you charge for it? How do you balance the public good (if any) against your personal gain? Do the needs of the many outweigh the needs of the few… or the one? What would Spock do? 
