Unleashing the Power of Quantum Computing

Before we dive into this topic with gusto and abandon (and aplomb, of course), it’s probably only fair for me to inform you that I don’t have a clue what I’m about to be talking about, if you see what I mean. “So, how does this differ from your other columns?” I hear you mutter under your breath. I’m obliged to admit that you have me there, and yet you are still reading, so ten points to me, I think.

Now, you probably think that my admitted lack of knowledge will result in a somewhat short column. Well, it may, or it may not, but—based on prior experience—I wouldn’t bet on it either way if I were you.

What do you know about quantum computing and quantum computers? Take a moment to jot things down…

That didn’t take long, did it? I just performed a survey that involved me running around the building in which I have my office, asking everyone who couldn’t run away fast enough, “Have you heard about quantum computers?” I was surprised to discover that quite a few people had heard the term. I was less surprised to discover that none of them knew anything more. When I asked one person who said he’d heard about them, “What do you know about them?” he replied, “What? They exist?” It turned out he’d been exposed to the concept in a science fiction film, which left him thinking that quantum computers were only the stuff of science fiction.

At this point I was tempted to start throwing some statistics around but, as Homer Simpson famously said, “People can come up with statistics to prove anything… forty percent of people know that.”


Also, my own faith in statistics was degraded when I read the classic How to Lie with Statistics by Darrell Huff, which I have sitting on the bookshelves here in my office, so let us proceed sans statistics.

I’m going to go out on a limb here by saying I believe most people know nothing about quantum computing other than the name (if that). I’d go further and say that even the majority of people with an interest in science, technology, and engineering know little more than the fact that the basic unit of quantum information is the quantum bit, or qubit. Also, they’ve probably heard that a quantum computer can solve in seconds problems that would take classical computers thousands of millions of years to solve, assuming classical computers could solve such problems at all.

If you really are starting at ground zero, then there was a 13-minute segment on 60 Minutes a couple of weeks ago that might prove interesting.


As is often the case in this sort of thing, American theoretical physicist, activist, futurologist, and popular-science writer, Michio Kaku, makes an appearance. The poignant point for me was at the end of the video when Michio says, “The language of the universe is the language of the quantum.” It’s a shame I have no talent for languages.

But what does this all mean? Well, in a classical computer, the fundamental unit of information is the bit, which can adopt one of two states: 0 or 1. As we already noted, in a quantum computer, the fundamental unit of information is the qubit. In a way, a qubit is also a 2-state system in that it involves something like electron spin (up or down) or photon polarization (left-handed or right-handed).

In a classical system, a bit is in one state or the other (unless it’s metastable, in which case all bets are off). In a quantum system, a qubit exists in a coherent superposition of both states simultaneously, which basically means it represents all possible values at once (and then things start to get complicated).
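For anyone who likes to see this written down (I’m leaning on the standard textbook notation here, nothing profound of my own), a single qubit’s state is a weighted mix of the two basis states, where the weights are complex numbers whose squared magnitudes give the probabilities of measuring a 0 or a 1:

```latex
% Standard textbook picture of a single qubit in superposition
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
% Measuring yields 0 with probability |\alpha|^2 and 1 with probability |\beta|^2
```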

Some problems are easy, and other problems are hard. An example of the latter is called the travelling salesman problem (TSP). This starts by asking the following question: “Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city exactly once before returning to the original city?” Suffice it to say that solving this problem—which is classed as an NP-Hard problem—using a classical computer is a lot harder than you might think.
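To give a feel for just how much harder, here’s a minimal brute-force sketch (the five cities and their distances are entirely made up for illustration). It simply tries every possible ordering, which is fine for five cities but grows factorially; by the time you reach 20 cities you’re staring at more than 10^16 distinct round trips.

```python
from itertools import permutations

# Made-up distances between five hypothetical cities (symmetric).
cities = ["A", "B", "C", "D", "E"]
dist = {
    ("A", "B"): 12, ("A", "C"): 10, ("A", "D"): 19, ("A", "E"): 8,
    ("B", "C"): 3,  ("B", "D"): 7,  ("B", "E"): 2,
    ("C", "D"): 6,  ("C", "E"): 20,
    ("D", "E"): 4,
}

def d(a, b):
    """Distance lookup that doesn't care which direction you travel."""
    return dist.get((a, b)) or dist.get((b, a))

def tour_length(order):
    """Total length of a round trip that starts and ends at the first city."""
    route = order + (order[0],)
    return sum(d(route[i], route[i + 1]) for i in range(len(order)))

# Brute force: try every ordering of the remaining cities. This is O((n-1)!),
# which is why nobody does it this way for interestingly large n.
start = cities[0]
best = min(
    ((start,) + perm for perm in permutations(cities[1:])),
    key=tour_length,
)
print(best, tour_length(best))
```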

Another interesting problem, one that is illustrated in the video above, involves a mouse solving a maze. If we model this on a classical computer, then the mouse exhaustively attempts one path after another, reversing direction when it meets a dead end, before finally emerging triumphant at the exit. By comparison, the way a quantum computer solves this problem—at least as described in the video—is to look at all possible paths simultaneously and select the right one. This isn’t the way it really works, but we will return to that in a moment.
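If we really did set a classical computer loose on this, the exhaustive one-path-after-another approach would look something like the following toy backtracking search (the maze layout is my own invention, purely for illustration):

```python
# A tiny made-up maze: '#' is a wall, '.' is open floor, S is the start, E is the exit.
MAZE = [
    "#########",
    "#S..#...#",
    "##.#.#.##",
    "#......E#",
    "#########",
]

def solve(maze):
    """Classical backtracking: try one path after another, retreating at dead ends."""
    grid = [list(row) for row in maze]
    start = next((r, c) for r, row in enumerate(grid)
                 for c, ch in enumerate(row) if ch == "S")

    def walk(r, c, path):
        cell = grid[r][c]
        if cell == "E":
            return path                     # found the exit
        if cell in "#x":
            return None                     # wall, or somewhere we've already been
        grid[r][c] = "x"                    # mark this square as visited
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            found = walk(r + dr, c + dc, path + [(r + dr, c + dc)])
            if found:
                return found
        return None                         # dead end: back up and try another way

    return walk(*start, [start])

print(solve(MAZE))   # prints the list of (row, column) squares from S to E
```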

One of the things that puzzled scientists for a long time is the efficiency of photosynthesis in green plants and cyanobacteria. Specifically, how they manage to achieve close to 100% efficiency when transferring the energy in photons of sunlight to molecular reaction centers for conversion into chemical energy. It turns out “quantum” is the answer. I remember reading an article several years ago that explained how scientists fired photons of light at a photosynthetic cell while also hitting it with femtosecond pulses of laser light. By this means, they observed what appeared to be each photon “looking ahead” to check out all possible paths from a receptor on the cell’s surface to a molecular reaction center inside the cell before settling on the lowest energy path. The next photon hitting the same receptor might take a completely different path to the same destination. Once again, this “looking ahead” analogy isn’t the way things really work and, once again, we will return to this in a moment.

One final example of why all this is of interest involves protein folding, which is the physical process whereby a protein chain folds from a random coil into its native three-dimensional structure, corresponding to its lowest energy configuration. As mind-blowingly complicated as this is, it becomes exponentially more complex when multiple proteins are interacting with each other. At this point I’d like to refer you to one of my favorite books of all time: Wetware: A Computer in Every Living Cell by Dennis Bray.

These are the sorts of problems that classical computers take forever to solve, while quantum computers offer the promise of being able to solve the same problems in seconds. Of course, they also offer the promise of being able to solve problems that were best left unsolved, like cracking cryptographic keys, but that’s a topic for another day.

So, are quantum computers real? Well, yes, sort of. Modern quantum theory was developed in the 1920s to explain the wave-particle duality observed at atomic scales, where wave-particle duality refers to the fact that quantum entities exhibit particle-like or wave-like properties depending on how they are observed and measured.

The concept of quantum computers was proposed in the 1980s by Richard Feynman and Yuri Manin. In 1994, Peter Shor showed that a quantum computer would be able to break RSA encryption, which caused a lot of people to start wearing their frowny faces. The first real quantum computer that could be loaded with real data and output a real solution was a 2-qubit machine created in 1998 by Isaac Chuang of the Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California at Berkeley.

What’s the current state of the art? I have no idea. All I know is that IBM unveiled its 433-qubit Osprey processor in November 2022, and Atom Computing announced a 1,180-qubit machine in October 2023.

The reason I’m waffling on about all this is that I was just chatting with Earl Campbell, who is VP of Quantum Science at Riverlane. We started by my telling Earl what I knew of quantum computing, and his telling me that everything I thought was wrong (but he was jolly nice, and he was smiling as he said it, and—unlike my wife (Gina the Gorgeous)—he didn’t imply that I was a complete knucklehead, so that was all right).


One way to visualize a next-generation quantum computer (Source: Riverlane)

One of the things I’ve long been confused about was how to make any sense of the results from a quantum computation. Earl explained that we begin with a classical binary representation on the inputs to the machine, we have the quantum monster in the middle, and the results are presented in a classical binary representation on the outputs. In the case of the quantum portion of the machine, my new understanding is that thinking in terms of fixed-size floating-point numbers is meaningless. The way I now think about each qubit is that it represents a complex number with, in effect, infinite precision (of course, I may be wrong). The thing is that, as explained by Heisenberg’s uncertainty principle (which states that there is a limit to the precision with which certain pairs of physical properties, such as position and momentum, can be simultaneously known), we can’t really tell what’s happening in the quantum part of the machine; we just have to say “Ooh” and “Aah” when it presents us with the results.
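For what it’s worth (and I can imagine Earl wincing again), my working mental model is that the only things we ever get back from the quantum part are the statistics of repeated measurements. The following toy simulation, which has nothing to do with how a real machine is built, “measures” a single qubit in superposition over and over, and every measurement spits out a plain old classical 0 or 1:

```python
import random
from collections import Counter

# Toy simulation only: a single qubit state as two complex amplitudes.
# This qubit is sqrt(0.7)|0> + sqrt(0.3)|1>, so we expect roughly
# 70% of measurements to return 0 and 30% to return 1.
alpha = 0.7 ** 0.5   # amplitude of |0>
beta = 0.3 ** 0.5    # amplitude of |1>

def measure():
    """Collapse the superposition into a classical bit, Born-rule style."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

counts = Counter(measure() for _ in range(10_000))
print(counts)   # e.g. Counter({0: 7031, 1: 2969})
```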

Another thing Earl said was that the maze analogy I mentioned earlier was fundamentally wrong. The way to think about things is to start by considering what happens when we randomly drop a handful of variably-sized pebbles into a pond. Each pebble will generate ripples. The ripples from all the pebbles will combine (interfere with each other) in weird and wonderful (constructive and destructive) ways. Returning to the maze (or any other quantum problem), the quantum elements start in all states simultaneously, each state combines with every other state constructively and destructively, and the system wave function collapses to provide the answer, which is always 42 (I think that’s what Earl said).
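The ripples are the part I can almost wrap my brain around: each path contributes a complex amplitude, and the amplitudes add together before anything gets squared into a probability. Here’s a back-of-the-envelope sketch (my own toy numbers, nothing Earl showed me) of two paths reinforcing and cancelling each other:

```python
import cmath

# Two paths to the same outcome, each with its own complex amplitude.
# The probability comes from the squared magnitude of the SUM of the
# amplitudes, not from adding probabilities. That is where interference
# (constructive and destructive) comes from.
a1 = cmath.rect(0.5, 0.0)                    # first path, phase 0
a2_in_phase = cmath.rect(0.5, 0.0)           # second path, same phase
a2_out_of_phase = cmath.rect(0.5, cmath.pi)  # second path, opposite phase

print(abs(a1 + a2_in_phase) ** 2)      # 1.0  -> constructive interference
print(abs(a1 + a2_out_of_phase) ** 2)  # ~0.0 -> destructive interference
```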

Unfortunately, there’s a fly in the soup and an elephant in the room (I never metaphor I didn’t like) that manifests itself in the form of quantum noise, which leads to quantum errors. In the case of classical computers, we tend to think about our 0s and 1s as corresponding to two different voltages—let’s say 0V and 5V, respectively, which shows how old I am. In reality, our 0s and 1s correspond to voltage bands—so anything below 1V corresponds to a logic 0, while anything above 4V corresponds to a logic 1, for example. Now, although we don’t like to think about it, errors occur in our classical digital computers all the time, like bits flipping in memory due to random radiation, for example. The answer is to employ things like error-correcting code (ECC) memory with additional bits used to detect and correct errors.
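The simplest possible flavor of this idea, purely by way of illustration (real ECC memory uses cleverer Hamming-style codes), is to store three copies of every bit and take a majority vote when reading it back:

```python
def encode(bit):
    """Triple-redundancy encoding: store three copies of the bit."""
    return [bit, bit, bit]

def decode(copies):
    """Majority vote: a single flipped copy is corrected automatically."""
    return 1 if sum(copies) >= 2 else 0

word = encode(1)        # [1, 1, 1]
word[0] ^= 1            # a stray cosmic ray flips one copy -> [0, 1, 1]
print(decode(word))     # 1 (the error has been corrected)
```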

Do you remember VHS video cassettes, which were analog in nature? If you took a video at a party and made a copy for a friend, and that friend made a copy for another friend, and that friend… you see where I’m going. It didn’t take long before replication errors compounded, making the later copies unwatchable. By comparison, a digital representation like a CD or DVD includes error detecting and correcting codes that maintain the fidelity of the data, which means you can make copies of copies ad infinitum, with the last copy being identical to the original (at least in terms of its 0s and 1s).

Now think about quantum computers with their qubits being in every state at once (sort of—I can imagine Earl wincing as he reads this), and quantum noise, and quantum errors. Can we detect and correct such errors? Once again, Heisenberg’s uncertainty principle comes into play, because trying to observe and measure the state of a quantum system like a qubit changes that state, causing its wave function to collapse.

To be honest, for a long time a lot of people thought this was going to prove to be an unsolvable problem, which is exactly the problem the boffins at Riverlane have set about solving. As Earl told me, “This is what Riverlane is working on. Solving this problem involves working with massive volumes of data, which can be hundreds of terabytes of data every second. We like to compare it to the volume of traffic on Netflix. The entire global traffic on Netflix would be the same amount of data that you’ll be looking at to run a commercial-grade quantum computer and decoding that.”

All of which leads us to the fact that the guys and gals at Riverlane have announced The World’s Most Powerful Quantum Decoder and The World’s First Quantum Error Correction Chip.


I know that I’ve waffled far too long on a subject I know nothing about, so let’s summarize things as follows. It’s one of the worst-kept secrets in the industry that quantum error correction (QEC) is what’s holding quantum computing back. QEC is a major obstacle in the way of practical quantum scaling, and without error correction there is no path to useful quantum computers.

In a crunchy nutshell, QEC is a set of techniques used to protect the information stored in qubits from the errors and decoherence caused by noise. Running QEC generates a continuous stream of measurement data, and a sophisticated algorithmic process called “decoding” is needed to turn that data into corrections.
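To give a flavor of what “decoding” means, here’s a grossly simplified sketch based on the textbook three-qubit bit-flip repetition code (a toy example, not Riverlane’s actual decoder, which wrangles far bigger codes at far scarier data rates). The trick is that we never measure the data qubits themselves; we measure parity checks between neighboring qubits, and the resulting “syndrome” tells us which correction to apply:

```python
# Three-qubit bit-flip code: one logical qubit encoded across qubits q0, q1, q2.
# We never look at the data qubits directly; we only measure two parity checks:
#   s1 = parity of (q0, q1), s2 = parity of (q1, q2).
# The pair (s1, s2) is the "syndrome", and decoding means mapping each
# syndrome to the most likely single error.
SYNDROME_TABLE = {
    (0, 0): None,   # no error detected
    (1, 0): "q0",   # flip on qubit 0
    (1, 1): "q1",   # flip on qubit 1
    (0, 1): "q2",   # flip on qubit 2
}

def decode(syndrome):
    """Return which qubit to correct (or None) for a measured syndrome."""
    return SYNDROME_TABLE[syndrome]

# Example: an error on q1 upsets both parity checks.
print(decode((1, 1)))   # "q1"
```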

The chaps and chapesses at Riverlane have a singular focus on QEC. They recently introduced a decoder chip in the form of an FPGA that demonstrates how QEC can help scale quantum computers to useful implementations. In fact, the little scamps have just published a paper in the prestigious journal Nature on this very topic.

All I can say is that I, for one, am (a) very impressed and (b) very confused. I have no idea how people wrap their brains around this stuff. I hope to visit the folks at Riverlane sometime. Until that frabjous day, I fear I will imagine this as a company composed of Sheldon Coopers and Amy Farrah Fowlers. What say you? What are your thoughts on quantum computing?

