
Richard Feynman and Quantum Computing

Forty Years On, an Idea is Becoming Reality

Richard Feynman, probably the most colourful physicist of the twentieth century, as well as one of the most important, was born a hundred years ago, on May 11th 1918. His long career had some significant highlights. Before he was thirty, he was responsible for running the computing unit at Los Alamos, where rooms full of women working with mechanical calculators carried out the complex calculations behind the first atom bombs. He also teased the site’s military security by opening locked filing cabinets and strong boxes, but, perhaps more importantly, he was regarded as a touchstone – a person who would analyse your ideas and respond to them, often clarifying them in the process. One person who sought him out was the great Niels Bohr, because, according to legend, Feynman was the only person who wasn’t in awe of him and so would argue with him.

At the other end of his career, he was part of the Rogers Commission investigating the Challenger disaster – the explosion of a space shuttle that killed all seven crew. During the debate as to whether O-ring seals had failed, causing a fire, which then caused the explosion, he dropped an O-ring into ice water and demonstrated that at freezing point it lost all resilience. He famously concluded, “I believe that has some significance for our problem.” He also spoke to technicians and engineers rather than to the top management of NASA and its suppliers, and his conclusions on the overall reliability of the shuttle appeared as a separate appendix to the report – after he threatened to withdraw his name from the report unless they did appear there. In particular, he was scathing that the figures for reliability given by management were several orders of magnitude more optimistic than those of the engineers. Management put the chance of a failure at 1 in 100,000 while Feynman’s research discovered that the engineering teams rated it at more like 1 in 100.  “For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

His Feynman Lectures on Physics were an attempt to provide a coherent introduction to physics for undergraduates. While the lecture course itself was discontinued as being too challenging, the books remain a legendary source, are available in a number of on-line versions, and a set of recordings of the original lectures is available as a series of audio books.

But his crowning achievement, for which he received the Nobel Prize for physics, was the work he did in the late 1940s and early 1950s on quantum electrodynamics (QED) – the way in which light and matter, photons and electrons, behave and how they interact. Many years later, in a lecture, he said he didn’t expect the audience to understand QED, because he didn’t either – the rules are screwy, and what he found particularly galling was that the theory relied on probability, rather than certainty. What Feynman did do was to describe QED. His description corresponded to observed behaviour with an accuracy of 15 decimal places. This, he suggested, was like knowing the distance to the moon to within the difference between measuring it from the top of his head and from his chin. (I haven’t checked this number.) What he was describing is the behaviour of photons and electrons and the interaction, in particular, of light and electricity. The topic was based on work by Dirac, Heisenberg and Schrödinger in the late 1920s, but any attempt to derive a way of predicting this behaviour failed until Feynman managed to do so. Working elsewhere on the problem and producing similar results were Sin-Itiro Tomonaga and Julian Schwinger, with whom he shared the Nobel Prize in 1965. A major part of Feynman’s contribution was the creation of what are now called Feynman diagrams, which provide a graphical model of how particles such as electrons and photons behave and interact.

As electronic computers developed in power, theoretical physicists began to make use of them in calculations and modelling, but they often struggled to find ways of representing things that could exist in multiple states. For example, an electron might be observable in one of two states. Simulating one electron is simple: it is in either state A or state B. With two electrons, you have the possibility of having both in state A, both in B, one in A and the other in B, or vice versa – a total of four possible combinations. With ten electrons, this rises to 1,024 combinations, and with 20 it is 1,048,576. But the systems that a physicist wants to investigate may have many millions of electrons, and the number of combinations becomes unimaginably large. In the late 1970s Feynman began considering this problem, and, in a paper published in 1982, Simulating Physics with Computers, he postulated that to simulate quantum systems you would need to build quantum computers. He did this by first looking at a range of systems that you might want to simulate and showing how they cannot be adequately represented by what he calls a classical computer. As far as I can understand his arguments (there are a lot of equations), it is not an issue that can be solved simply by scaling the classical computer, or by applying massive parallelism.
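To get a feel for that scaling, here is a back-of-the-envelope sketch in Python (my own illustration, not anything from Feynman’s paper). It counts the configurations a classical simulator would have to track for n two-state particles, and the memory needed to hold one complex amplitude per configuration.

    # Cost of classically tracking every configuration of n two-state particles,
    # storing one complex amplitude (16 bytes) per configuration.
    for n in (2, 10, 20, 50, 300):
        states = 2 ** n
        gib = states * 16 / 2 ** 30
        print(f"{n:>3} particles: {states:.3e} configurations, ~{gib:.3e} GiB of amplitudes")

At 50 particles the amplitudes alone would need tens of petabytes; at 300 the number of configurations exceeds the number of atoms in the observable universe, which is the heart of Feynman’s argument.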

And we are now seeing work on actually producing quantum computers, partly to cope with QED problems and partly to cope with the fact that we will soon reach the physical limit on how small we can make a transistor that behaves deterministically rather than being driven by quantum effects. But while a theoretical quantum computer is relatively easy to describe, it has proved much harder to build. The problems are so great that, nearly 40 years after quantum computers were first postulated, they are still very rare beasts. And just as QED is difficult to understand, so quantum computing can also make your brain hurt. What follows is an attempt to provide a high-level view, and it may be accused of over-simplification. Grab your favourite search engine and start digging if you find it interesting.

We all know that in digital computing we work with bits (binary digits), which are always either 0 or 1. The equivalent in quantum computing is the quantum bit, or qubit. This exists as 0 or 1 – or, in the state of quantum superposition, effectively both states at the same time. (Some descriptions say that the qubit exists in two universes.) A quantum computer of n qubits can be in a superposition of 2^n different states at the same time. A normal computer with the same number of bits can be in only one of its 2^n states at any one time.
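One way to make “both at once” concrete: an n-qubit register is described by a list of 2^n complex amplitudes, one per classical bit pattern, while a classical n-bit register holds exactly one pattern. The short NumPy sketch below is my own simplified bookkeeping illustration, not a description of how real hardware stores a state.

    import numpy as np

    n = 3
    dim = 2 ** n                       # 2^n basis states: |000>, |001>, ... |111>

    classical = 0b101                  # a classical 3-bit register holds one pattern

    # A 3-qubit register carries a complex amplitude for every pattern at once;
    # here, an equal superposition of all eight basis states.
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

    print(classical)                   # 5
    print(state)                       # eight amplitudes, each 1/sqrt(8)
    print((np.abs(state) ** 2).sum())  # squared magnitudes sum to 1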

As well as Feynman’s vision of modelling systems at the quantum level, the huge number of possible states makes a quantum computer a strong candidate for attacking many large data problems, including, for example, analysis of chemical interactions, speeding up searches of very large databases, solving otherwise intractable optimisation problems, and providing very fast cracking of encoded messages. This last is why governments around the world are investing in quantum computing.

Theoretically, there are multiple ways of making a qubit, but all of them are complex. The currently favoured technology is to use superconductors. This requires a temperature close to absolute zero, so you need a very large and expensive piece of kit surrounding a minute area of computing. There are a number of other problems inherent in the technology. A particular one is quantum decoherence, the tendency for a system to lose information through interaction with the environment. Associated with this is the difficulty of reading the state of the system (i.e. getting information out) without changing its state. You may remember those discussions in physics where the very act of observing something caused it to change.
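As a toy illustration of the read-out problem (a simulation only, not a description of any real hardware), the sketch below measures a qubit held in an unequal superposition: the outcome is 0 or 1 with probabilities given by the squared amplitudes, and once measured, the superposition is gone – repeated reads just return the same answer.

    import numpy as np

    rng = np.random.default_rng()

    # One qubit in an unequal superposition: amplitude 0.8 on |0>, 0.6 on |1>.
    state = np.array([0.8, 0.6], dtype=complex)

    def measure(state):
        """Sample an outcome with probability |amplitude|^2, then collapse."""
        probs = np.abs(state) ** 2
        outcome = rng.choice(len(state), p=probs)
        collapsed = np.zeros_like(state)
        collapsed[outcome] = 1.0
        return outcome, collapsed

    outcome, state = measure(state)           # ~64% chance of 0, ~36% chance of 1
    print("first read:", outcome)
    print("second read:", measure(state)[0])  # the same answer from now on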

There are a small number of machines in existence – very few, given how long the work has continued – and recently an arms race has begun. Last year IBM announced a 50-qubit machine – but it is not clear when this will enter service – and then in March this year Google announced a 72-qubit machine.

While the physical implementation has been slow, there has been a lot of work on algorithms that will run on machines when they are built. For example, in 1994, Peter Shor, a mathematician now at MIT, published what is now known as Shor’s algorithm, a method of factoring large numbers exponentially faster than the best known classical approaches – a key tool in breaking public key encryption. Some of these algorithms are now under test, as IBM has opened free access to its 20-qubit machine to researchers, with a wide range of support and tools.
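For the curious, Shor’s algorithm rests on a reduction from factoring to period finding: pick a random a coprime to N, find the order r such that a^r ≡ 1 (mod N), and, if r is even and a^(r/2) is not ≡ −1 (mod N), then gcd(a^(r/2) ± 1, N) gives non-trivial factors of N. Only the period finding needs the quantum computer. The toy Python sketch below (illustrative only) does that step by brute force – exactly the part that becomes hopeless classically as N grows.

    from math import gcd

    def order(a, N):
        """Smallest r > 0 with a**r % N == 1. Brute force here; this is the
        step a quantum computer speeds up with the quantum Fourier transform."""
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_classical(N, a):
        assert gcd(a, N) == 1, "a must be coprime to N"
        r = order(a, N)
        if r % 2 != 0 or pow(a, r // 2, N) == N - 1:
            return None                           # unlucky choice of a: try another
        return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

    print(shor_classical(15, 7))                  # order of 7 mod 15 is 4 -> (3, 5)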

Everyone in the quantum computing world is keen to make it clear that these machines are not a replacement for the classical computer but are specialists, likely always to be an add-on to existing machines and data farms.

In the early 1970s, digital computers passed a tipping point when integrated circuits began to be widely used, both in the processing unit and in the memory. This caused an explosion in the number of computers, with the first personal computers appearing at the end of the decade. Quantum computing has yet to reach that point. Until it does, quantum computers will be confined to the research lab, but, if the tipping point happens, then there will be a powerful new tool that will transform data processing.

59 thoughts on “Richard Feynman and Quantum Computing”

  1. You wrote, “(Quantum computing) will be a powerful new tool that will transform data processing.”

    True.

    And when cancer is cured, availability of hospital palliative care beds will rise sharply.

    Can somebody below suggest an imaginative way to describe the implications of Quantum Computational resources?

    Rather unimaginatively, I would say, “the implications are unbounded. Everything will change; particularly if one of the first applications of effective quantum computing is to engineer quantum computers with greater capacity.”

    But to what end? By using large capacity quantum computers, we’ll be able to model — and solve — problems that are currently so far beyond our ability to solve that we don’t even recognize them as problems (yet).

    While that might be typical of many new technologies, I think in the case of Quantum Computing, the leap will be unprecedented; it’s an inflection point.

