
A Light at the End of the Tunnel

Photonics May Be the Next Big Thing

“Begone, Prince of Insufficient Light!” — Dogbert 

Imagine describing electronics, and electrons, to an 8-year-old. “Electrons are really tiny little particles that move through the wires, see? And they can turn switches on and off, and do stuff, and… uh… perform magic tricks.”

Every word of that description is inaccurate, but it kinda, sorta, gives newcomers the flavor of how electronics works. Like religion, it doesn’t have to be true to be helpful.

Now imagine describing photonics. To anyone. For most beginners, we simply replace the word “electron” with “photon” and everything else stays the same. They’re still really small particles (we think), they still flow through tiny wires (sort of), and they still perform magic. Remember what Arthur C. Clarke said about sufficiently advanced technologies…

You can’t get very far in either discussion without falling back onto analogies, metaphors, and frankly imaginary mental models. Photonics is just too foreign – too weird – to really grok in its entirety. Describing photonics in 2016 must be like describing electronics to someone in 1893.

That’s how I felt when speaking to Twan Korthorst, CEO of the Dutch firm Phoenix Software. Twan’s English is perfect, but he still had a hard time getting me to understand where the photonics business was heading. I’m pretty sure the fault wasn’t with him.

To many people, photonics falls into that broad category of “fringe technologies that are kind of like semiconductors but not really mainstream.” Think MEMS, but without the moving parts. The term covers several different technologies and usage models, from using light particles to switch logic gates to using lasers to drive long-haul transmission. As with MEMS, the idea is to leverage existing CMOS fabrication technology to make photonics both easy and cost-effective to manufacture. And, as with MEMS, the level of success has been mixed.

Back in the stupid-fast growth era of the 1990s when venture money rained out of the sky, photonics startups were sprouting up like weeds. Everyone thought light-based digital chips would be The Next Big Thing. That all stopped abruptly when the money and the markets dried up, and tens of thousands of photonics engineers had to go get real jobs.

Nowadays, commercial deployment of photonics is largely limited to cross-town and cross-country telecommunications. Long-haul data transport tends to value bandwidth over money. In other words, the world’s telecommunications corporations desperately want to increase the bandwidth of their pipes, never mind the cost. Lasers, fiber optics, and photonic chips neatly fill that market niche.

Pointing a big ol’ laser down a long light guide is one thing, but what telecom giants really want is thousands of little bitty lasers, and that means more photonics chips. The goal here is to integrate tens, hundreds, or even thousands of light sources and their interfaces onto a single silicon chip, much like we bond out the I/O pads on a conventional device. That turns out to be tricky, for many of the same reasons that stymie MEMS manufacturing or mixed-signal analog/digital devices. We’ve gone so far down the road of optimizing CMOS manufacturing for purely digital circuits that it’s hard to repurpose that technology without making it unreliable, expensive, or both.

As any ASIC designer will tell you, most of the energy goes into the I/O interface, not the logic. Internal switching levels are very low voltage, but I/O – even on-chip I/O – still needs to be relatively high-powered. DRAM drivers, for example, can consume more energy than all the logic that constitutes the DRAM interface itself.

Similarly, it’s the interfaces – the Ethernet drivers, PCIe lanes, and so on – that consume the lion’s share of a network chip’s power budget. If those interfaces can be made photonic instead of electronic, the thinking goes, then the overall power and heat taxes can be reduced, too.
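To put rough shape on that argument, here’s a back-of-envelope sketch. Every figure in it is an illustrative assumption of mine (not from Twan, and not from any datasheet); the point is only to show how energy-per-bit multiplied by bandwidth makes I/O the dominant line item, and how a lower-energy photonic link would shift the balance.

```python
# Back-of-envelope power budget for a hypothetical network chip.
# All pJ/bit figures below are illustrative assumptions, not measured data.

TOTAL_BANDWIDTH_GBPS = 400        # assumed aggregate I/O bandwidth
LOGIC_ENERGY_PJ_PER_BIT = 1.0     # assumed core logic/switching energy per bit
ELECTRICAL_IO_PJ_PER_BIT = 10.0   # assumed electrical SerDes/driver energy per bit
OPTICAL_IO_PJ_PER_BIT = 3.0       # assumed integrated-photonics link energy per bit

def power_watts(bandwidth_gbps: float, pj_per_bit: float) -> float:
    """Convert bandwidth (Gb/s) and energy per bit (pJ) into watts."""
    bits_per_second = bandwidth_gbps * 1e9
    joules_per_bit = pj_per_bit * 1e-12
    return bits_per_second * joules_per_bit

logic_w = power_watts(TOTAL_BANDWIDTH_GBPS, LOGIC_ENERGY_PJ_PER_BIT)
elec_io_w = power_watts(TOTAL_BANDWIDTH_GBPS, ELECTRICAL_IO_PJ_PER_BIT)
opt_io_w = power_watts(TOTAL_BANDWIDTH_GBPS, OPTICAL_IO_PJ_PER_BIT)

print(f"logic:          {logic_w:5.1f} W")
print(f"electrical I/O: {elec_io_w:5.1f} W  ({elec_io_w / (elec_io_w + logic_w):.0%} of total)")
print(f"photonic I/O:   {opt_io_w:5.1f} W  ({opt_io_w / (opt_io_w + logic_w):.0%} of total)")
```

With those made-up numbers, I/O dominates either way, but swapping the electrical drivers for photonic ones cuts the chip’s total power budget by more than half, which is exactly the pitch.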

Twan says that Phoenix is looking at using silicon as a photonic waveguide, which would make design and manufacturing a lot easier. But that still leaves the interface: at some point we need to convert from the silicon/digital realm to the fiber/photonic world. That’s still a sticking point, but it’s getting pushed farther and farther toward the edges of the chip. Or possibly off the chip, but within the same package.

Then there’s the truly weird quantum-computing stuff. That’s a bit beyond Phoenix’s immediate focus, but Twan believes it’s definitely on the horizon. “When the transistor was invented we never imagined we’d get the smart phone. After the century of the steam engine (the electron), we have now entered the era of the photon.”

Maybe so. It’s unlikely that Benjamin Franklin, experimenting with his kite and his key, foresaw the coming of smart phones, digital computers, or YouTube. It’s equally unlikely that we’ll know where photonics is taking us. It might remain just an interesting science experiment for several more lifetimes. Or we might be almost there. And then our grandchildren can try to explain to us how it works. 
