
Richard Feynman and Quantum Computing

Forty Years On, an Idea is Becoming Reality

Richard Feynman, probably the most colourful physicist of the twentieth century, as well as one of the most important, was born a hundred years ago, on May 11th 1918. In a long career, there were some significant highlights. Before he was thirty, he was responsible for running the computing unit at Los Alamos, where rooms full of women working with mechanical calculators carried out the complex calculations behind the first atom bombs. He also teased the military security in place by opening locked filing cabinets and strongboxes. Perhaps more importantly, he was regarded as a touchstone – a person who would analyse your ideas and respond to them, often clarifying them in the process. One person who sought him out was the great Niels Bohr, because, according to legend, Feynman was the only person who wasn’t in awe of him and so would argue with him.

At the other end of his career, he was part of the Rogers Commission investigating the Challenger disaster – the explosion of a space shuttle that killed all seven crew. During the debate as to whether O-ring seals had failed, causing a fire which then caused the explosion, he dropped an O-ring into ice water and demonstrated that at freezing point it lost all resilience. He famously concluded, “I believe that has some significance for our problem.” He also spoke to technicians and engineers rather than to the top management of NASA and its suppliers, and his conclusions on the overall reliability of the shuttle appeared as a separate appendix to the report – after he threatened to withdraw his name from the report unless they did appear there. In particular, he was scathing that the figures for reliability given by management were several orders of magnitude more optimistic than those of the engineers. Management put the chance of a failure at 1 in 100,000, while Feynman’s research found that the engineering teams rated it at more like 1 in 100. As he put it: “For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

His Feynman Lectures on Physics were an attempt to provide a coherent introduction to physics for undergraduates. While the lecture course was eventually discontinued as too challenging, the books remain a legendary and valuable source; they are available in a number of on-line versions, and a set of recordings of the original lectures is available as a series of audio books.

But his crowning achievement, for which he received the Nobel Prize for physics, was the work he did in the late 1940s and early 1950s on quantum electrodynamics (QED) – the theory of how light and matter, in the form of photons and electrons, behave and how they interact. Many years later, in a lecture, he said he didn’t expect the audience to understand QED, because he didn’t either – the rules are screwy, and what he found particularly galling was that the theory relied on probability rather than certainty. What Feynman did do was to describe QED. His description corresponded to observed behaviour with an accuracy of 15 decimal places. This, he suggested, was the difference between the distance to the moon measured from the top of his head and that measured from his chin. (I haven’t checked this number.) What he was describing is the behaviour of photons and electrons – the interaction, in particular, of light and electricity. The theory was founded on work by Dirac, Heisenberg and Schrödinger in the late 1920s, but attempts to use it to predict this behaviour kept failing until Feynman found a workable approach. Working elsewhere on the problem and producing similar results were Sin-Itiro Tomonaga and Julian Schwinger, with whom he shared the Nobel Prize in 1965. A major part of Feynman’s contribution was the creation of what are now called Feynman diagrams, which provide a graphical model of how particles such as electrons and photons behave and interact.

As electronic computers developed in power, so theoretical physicists began to make use of them in calculations and modelling, but they often struggled to represent things that can exist in multiple states. For example, an electron might be observable in one of two states. Simulating one electron is simple: it is in either state A or state B. With two electrons, you can have both in state A, both in B, one in A and the other in B, or vice versa – a total of four possible combinations. With ten electrons this rises to 1,024 combinations, and with 20 to 1,048,576. But the systems a physicist wants to investigate may contain many millions of electrons, and the number of combinations becomes unimaginably large. In the late 1970s Feynman began considering this problem, and, in a paper published in 1982, Simulating Physics with Computers, he postulated that to simulate quantum systems you would need to build quantum computers. He did this by first looking at a range of systems you might want to simulate and showing how they cannot be adequately represented by what he calls a classical computer. As far as I can understand his arguments (there are a lot of equations), it is not an issue that can be solved simply by scaling the classical computer, or by applying massive parallelism.
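
To make that scaling concrete, here is a small Python sketch (my own illustration, not anything from Feynman's paper): a complete classical description of n two-state particles needs 2^n complex amplitudes, so the memory cost doubles with every particle added.

```python
# Illustrative only: memory needed for a full classical description of
# n two-state particles, at one complex amplitude per possible combination.

BYTES_PER_AMPLITUDE = 16  # one complex number = two 64-bit floats

def amplitudes_needed(n_particles: int) -> int:
    """2**n complex amplitudes describe every combination of n two-state particles."""
    return 2 ** n_particles

for n in (1, 2, 10, 20, 50):
    count = amplitudes_needed(n)
    print(f"{n:3d} particles: {count:>20,d} amplitudes, "
          f"{count * BYTES_PER_AMPLITUDE:>25,d} bytes")
```

At 50 particles the table already runs to petabytes of storage, which is why brute-force simulation stops being an option long before you reach "many millions of electrons".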

And we are now seeing work on actually producing quantum computers, partly to cope with QED problems and partly to cope with the fact that we will soon reach the physical limits on how small we can make a transistor that remains deterministic rather than itself being driven by quantum effects. But while a theoretical quantum computer is relatively easy to describe, it has proved much harder to implement. The problems are so great that, nearly 40 years after quantum computers were first postulated, they are still very rare beasts. And just as QED is difficult to understand, so quantum computing can also make your brain hurt. What follows is an attempt to provide a high-level view, and it may be accused of over-simplification. Grab your favourite search engine and start digging if you find it interesting.

We all know that in digital computing we work with bits (binary digits), which are always either 0 or 1. The equivalent in quantum computing is the quantum bit, or qubit. A qubit can be 0 or 1 – or, in a state of quantum superposition, effectively both at the same time. (Some descriptions say that the qubit exists in two universes.) A quantum computer of n qubits can be in 2^n different states at the same time. A normal computer with the same number of bits can be in only one of its 2^n states at any one time.
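
As a rough illustration (my own, not part of the article), the usual way to describe a qubit on paper is as a pair of complex amplitudes, one for each basis state. A gate such as the Hadamard turns a definite 0 into an equal superposition of 0 and 1, and the squared magnitudes of the amplitudes give the probability of each measurement outcome.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)    # |0>: definitely 0
ket1 = np.array([0, 1], dtype=complex)    # |1>: definitely 1 (shown for reference)

# Hadamard gate: turns a definite 0 or 1 into an equal superposition of both.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0
probabilities = np.abs(superposed) ** 2   # Born rule: probability = |amplitude|^2

print("amplitudes:  ", superposed)        # both roughly 0.707
print("P(measure 0):", probabilities[0])  # 0.5
print("P(measure 1):", probabilities[1])  # 0.5

# An n-qubit register is described by 2**n such amplitudes -- the "2^n states
# at the same time" referred to above -- which is exactly what makes classical
# simulation of large registers impractical.
```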

As well as Feynman’s vision of modelling systems at the quantum level, the huge number of possible states makes a quantum computer a strong candidate for attacking many large data problems, including, for example, analysis of chemical interactions, speeding up searches of very large databases, solving otherwise intractable optimisation problems, and very fast cracking of encoded messages. This last is why governments around the world are investing in quantum computing.
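
The database-search claim has a concrete algorithm behind it: Grover's algorithm, which finds a marked item among N unsorted entries in roughly √N steps instead of roughly N. Below is a small numpy sketch of the idea (my own illustration; it simulates the amplitudes classically, so it gains nothing in speed – it just shows how the probability piles up on the marked item).

```python
import numpy as np

n_qubits = 8
N = 2 ** n_qubits            # 256 "database" entries
marked = 123                 # index of the entry we are searching for

# Start in a uniform superposition: every entry has the same amplitude.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.pi / 4 * np.sqrt(N))   # about sqrt(N) iterations (12 here)
for _ in range(iterations):
    state[marked] *= -1                    # oracle: flip the sign of the marked entry
    state = 2 * state.mean() - state       # diffusion: inversion about the mean

print(f"after {iterations} iterations, P(marked) = {state[marked] ** 2:.3f}")
# Prints a probability close to 1 -- reached in ~sqrt(N) steps rather than ~N.
```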

Theoretically, there are multiple ways of making a qubit, but all of them are complex. The currently favoured technology is to use superconductors. This requires a temperature close to absolute zero, so you need a very large and expensive piece of kit surrounding a minute area of computing. There are a number of other problems inherent in the technology, a particular one being quantum decoherence – the tendency for a system to lose information through interaction with its environment. Associated with this is the difficulty of reading the state of the system (i.e. getting information out) without changing that state. You may remember those discussions in physics where the very act of observing something causes it to change.

There are a small number of machines in existence – very few, given how long the work has been going on – and recently an arms race has begun. Last year IBM announced a 50-qubit machine – though it is not clear when this will enter service – and then in March this year Google announced a 72-qubit machine.

While the physical implementation has been slow, there has been a lot of work on algorithms that will run on the machines once they are built. For example, in 1994, Peter Shor, a mathematician now at MIT, published what is now known as Shor’s algorithm, which can factor large numbers exponentially faster than any known classical method – a key tool in breaking public-key encryption. Some of these algorithms are now under test, as IBM has opened free access to its 20-qubit machine to researchers, with a wide range of support and tools.
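
For a flavour of how Shor's algorithm is structured, here is a hedged Python sketch (my own, not taken from Shor's paper): all the number theory around the edges is classical, and the only genuinely quantum step is finding the order of a number modulo N, which is stubbed out below with a brute-force loop. On a quantum computer that single step is done with a quantum Fourier transform, which is where the exponential speed-up comes from.

```python
import math
import random

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r == 1 (mod n). This is the step a quantum
    computer does with a quantum Fourier transform; here it is brute force,
    so the whole routine stays exponential on a classical machine."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Return a non-trivial factor of an odd composite n that is not a prime power."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                # lucky: a already shares a factor with n
        r = find_order(a, n)
        if r % 2:
            continue                # odd order: try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                # a**(r/2) == -1 (mod n): try another a
        return math.gcd(y - 1, n)   # guaranteed to be a non-trivial factor

print(shor_factor(15))   # 3 or 5
print(shor_factor(21))   # 3 or 7
```

Against small numbers such as 15 or 21 this runs instantly; against the 2,048-bit numbers used in real public-key encryption, the brute-force order-finder would take longer than the age of the universe, which is exactly the gap a quantum computer is meant to close.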

Everyone in the quantum computing world is keen to make it clear that these machines are not a replacement for the classical computer but are specialists, likely always to be an add-on to existing machines and data farms.

In the early 1970s, digital computers passed a tipping point when integrated circuits – and the transistors they carry – came into widespread use, both in the processing unit and in the memory. This caused an explosion in the number of computers, with the first personal computers appearing at the end of the decade. Quantum computing has yet to reach that point. Until it does, quantum computers will be confined to the research lab, but, if the tipping point comes, there will be a powerful new tool that will transform data processing.

