Quantum Computing – Where Are We Heading with This New Technology in This Decade

Dr. Hossein Eslambolchi
April 2012

Overview

In 1965, Gordon Moore, who would later co-found Intel Corporation, observed that the number of transistors on a silicon chip was doubling every year. In what became known as Moore’s Law, it was predicted that the number of transistors on a chip would continue to double at regular intervals for the foreseeable future.

Transistors are tiny on/off switches that conduct the electrical signals that get represented as “1”s and “0”s in computers. Engineers have been able to devise techniques to shrink transistors to ever smaller sizes, resulting in increasing performance and capabilities of microprocessors and memory devices. At some point an individual transistor will become so tiny that the quantum nature of electrons can result in computation errors and the transistors will no longer be reliable conductors of the electrical signal.

One possibility for overcoming this limit is to create quantum computers that can take advantage of the quantum nature of molecules and smaller particles to perform the processing tasks of a traditional computer. Quantum “processors” could possibly one day replace silicon chips, just as the transistor replaced the vacuum tube.

Quantum mechanics indicates that the world inside an atom looks very different from the world we observe around us. The seemingly impossible is possible. One quantum phenomenon is that a subatomic particle can be in two places at the same time. So, whereas a transistor is either “on” or “off”, a device made of these particles can be “on” and “off” simultaneously. Scientists are using this behavior to develop quantum computers that they hope one day will be capable of processing vast amounts of information at an extraordinary speed.

Early interest in quantum computers has focused on solving problems that seem intractable with traditional models of computation. A number of discoveries have been made, indicating that quantum computers could have some very practical applications. The Shor algorithm, devised in 1994, could use a quantum computer to factor very large numbers – on the order of 10^200. This has important implications for the field of cryptography, which today is based on the difficulty of factoring very large numbers into their primes. The Grover algorithm, developed in 1996, could speed up searches of an unsorted database. Quantum computers could also be useful for simulations of quantum mechanical effects in physics, chemistry, biology and other fields.

Quantum computing is still largely theoretical and there is no agreement on the best way to build a quantum computer. We don’t know what a quantum computer capable of complex computing will look like or even if it is actually possible to build one — we have just begun the journey. We do know that, whatever the computing device we are using, by the mid-century it will be far more powerful than the computers we use today — just as today’s machines greatly overshadow their predecessors.

Moore’s Law

  • In the mid-60s, Gordon Moore observed that the number of devices per square inch on integrated circuits had doubled every year since the integrated circuit was invented, and he predicted that this trend would continue for the foreseeable future. He recently stated he expects we have another 20 years or so before the law breaks down.
  • In subsequent years, the pace slowed down a bit, but data density has doubled approximately every 18 months — the current definition of Moore’s Law.
  • The original statement of Moore’s Law was presented in Electronics Magazine in 1965. His observation was graphed as the logarithm of the number of components (transistors and resistors) on a chip over time, showing a doubling roughly every year. Moore presented a revised law at the 1975 IEEE International Electron Devices Meeting, plotting the logarithm of the number of transistors on a chip over time and showing a two-year doubling cycle. In the late 1980s, Moore’s Law was revised again to an 18-month cycle.
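The doubling rule above amounts to the formula count(t) = count(t0) · 2^((t − t0)/T), where T is the doubling period. The sketch below is a toy illustration of that law, not a model endorsed by the article: the 60-component 1965 starting point is taken from the chip figures cited later in this document, and the two-year period is one of Moore’s proposed cycles.

```python
def transistor_count(year, start_year=1965, start_count=60, doubling_years=2.0):
    """Project device count under a fixed doubling period (Moore's Law sketch)."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Exponential growth from a tiny base quickly reaches enormous counts.
for year in (1965, 1975, 1985, 2005):
    print(year, round(transistor_count(year)))
```

Changing `doubling_years` to 1.5 reproduces the 18-month cycle mentioned above; the point is that any fixed doubling period gives exponential, not linear, growth.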

 

Moore’s Law at the Limit

Today’s chips

  • In 1965, state-of-the-art chips contained about 60 distinct devices — transistors and resistors at the time. Intel’s latest Itanium chip has 1.7 billion transistors and is made using a 90-nanometer process technology.[1]
  • Transistors are electronic switches made from materials that conduct electricity when energized. On one side of the switch is the source (the emitter in older bipolar designs); on the other side is the output, or drain. In between is the transistor gate. When voltage is applied to the gate, the transistor becomes conductive and current flows from the source to the drain. When there is no voltage at the gate, the transistor is non-conductive.

The problem

  • At some point silicon transistors will be so small that the source and drain will be so close that spontaneous tunneling of electrons between them becomes likely. According to the Heisenberg uncertainty principle, there will be no way of knowing whether an individual switch is on or off, because it is impossible to predict the specific location of an individual electron. Transistors will then lose their reliability, because it will be impossible to control the flow of electrons and hence the creation of “1”s and “0”s needed for computing.
  • In other words, at some point the size of each transistor will become so small that the quantum nature of electrons will result in computation errors.
  • Experts have differing opinions as to when the ability to shrink transistors will become problematic. Intel says that around 2015, manufacturers will start to move toward hybrid chips, which combine elements of traditional transistors with newer technologies such as nanowires. A full conversion to new types of chips may not occur until the 2020s.
  • Theoretically, silicon transistors can be shrunk until about the four-nanometer process generation. It’s at that point that the source and the drain will be so close that electrons will be able to tunnel across on their own, whether the gate is “open” or “closed”.

Why it Matters

  • The proliferation of data generated by business processes, scientific research, and government mandates, combined with the spread of personal computers, handheld and wireless devices, and the Internet, has led to demand for ever-growing speed and computing capacity.
  • Improvements in chip technologies have enabled the industry to develop ever more capable electronic equipment. Among other improvements, chip vendors have been able to maintain a significant rate of miniaturization.
  • But, at some point, manufacturers will need to adopt alternative technologies to the transistor. Alternatives include carbon nanotubes, silicon nanowire transistors, molecular crossbar switches, phase change materials, and spintronics.

 

What’s next?

Atoms Instead of Chips

  • One possibility for the future is the quantum computer, which takes advantage of the quantum nature of subatomic particles to perform the memory and processing tasks of a classical computer. Quantum “processors” could possibly one day replace silicon chips, just as the transistor replaced the vacuum tube.

 

The Qubit and the Principle of Superposition

  • Quantum computers are based on the principles of quantum mechanics — mainly on the principle of superposition, which says that a quantum system does not have to exist in any single definite state but can be in a combination of several of its states at once.
  • Quantum computers work by manipulating quantum bits, or qubits. Qubits are the counterpart to the computer bit we know today, where the transistor in the “on” or “off” state gives a value of “1” or “0”. Just as bits are the basic unit or building block of classical computing, qubits are the basic unit for quantum computers and have two basic states giving the value of “1” or “0”.[2] In a quantum computer, particles such as the electron or photon can be used as qubits, with either their charge or polarization representing the two basic states. (Ions have also been used in experimental models.)
  • Given the principle of superposition, one qubit can be in both the “1” and “0” states at the same time. With two qubits, you can have a superposition of four states: “00”, “01”, “10”, and “11”. This means that four calculations can be performed at once. As more and more qubits are added, the number of simultaneous calculations grows dramatically — in fact it scales exponentially, as 2^N, where N is the number of qubits. Compare this with the linear scaling of a classical computer, where each bit is a “1” or a “0”. Quantum computers could therefore theoretically perform certain types of computations much faster than any computer of today.
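The 2^N scaling above can be made concrete with a toy simulation. The sketch below (illustrative code, not drawn from the article) represents an N-qubit register as a list of 2^N amplitudes and applies a Hadamard gate — the standard gate for creating superposition — to each qubit, leaving all 2^3 = 8 basis states present at once.

```python
from math import sqrt

def apply_hadamard(state, target):
    """Apply a Hadamard gate to qubit `target` of a statevector,
    given as a list of 2**n amplitudes with basis states indexed in binary."""
    h = 1 / sqrt(2)
    out = state[:]
    for i in range(len(state)):
        if not (i >> target) & 1:          # pair |...0...> with its |...1...> partner
            j = i | (1 << target)
            out[i] = h * (state[i] + state[j])
            out[j] = h * (state[i] - state[j])
    return out

n = 3
state = [0.0] * (2 ** n)                   # 2**3 = 8 amplitudes
state[0] = 1.0                             # start in the definite state |000>
for q in range(n):
    state = apply_hadamard(state, q)

# All 8 basis states now carry equal weight — 2**N simultaneous values
# held in a register of only N = 3 qubits.
print([round(a, 4) for a in state])
```

Note the catch that classical simulation makes visible: the list itself grows as 2^N, which is exactly why classical machines cannot efficiently mimic a large quantum register.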

Qubits and the Principle of Entanglement

  • Another quantum principle important for quantum computing is entanglement, which says that two particles that have interacted at some point can become correlated so that whatever happens to one particle is immediately reflected in the other. The two particles remain correlated regardless of how far apart they might be, as long as they remain in isolation. Einstein called this “spooky action at a distance.”
  • Quantum mechanics shows that whereas a particle can be in two states simultaneously — as with the qubit — once you measure the particle, it settles into one state or the other. At the instant one entangled particle settles into a particular state, the other instantly locks into the opposite. So a measurement on either particle provides information on the state of its mate — it will be the opposite.
  • Some scientists believe that this phenomenon and quantum superposition can create greatly enhanced computing capabilities.
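The perfect anti-correlation described above can be illustrated with a toy simulation of a two-qubit Bell (singlet) state. The code is a sketch under the standard quantum measurement rule (outcome probabilities are squared amplitude magnitudes), not a description of any real device.

```python
import random
from math import sqrt

# Singlet Bell state over two qubits: (|01> - |10>) / sqrt(2).
# Amplitudes listed in basis order |00>, |01>, |10>, |11>.
state = [0.0, 1 / sqrt(2), -1 / sqrt(2), 0.0]

def measure_both(state):
    """Sample a joint measurement of both qubits in the computational basis."""
    probs = [abs(a) ** 2 for a in state]
    outcome = random.choices(range(4), weights=probs)[0]
    return (outcome >> 1) & 1, outcome & 1   # (first qubit, second qubit)

# Every single run yields opposite values for the two qubits: (0,1) or (1,0),
# no matter how many times we sample.
results = [measure_both(state) for _ in range(1000)]
print(all(a != b for a, b in results))
```

Each individual outcome is random, yet the pair is always opposite — the correlation, not the individual result, is what entanglement guarantees.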

 

Quantum Computing Applications

  • Early interest in quantum computers has focused on solving problems that seem intractable with traditional models of computation.
  • In 1994, Peter Shor, a scientist at AT&T Bell Laboratories, developed a quantum computer algorithm to factor large numbers — on the order of 10^200. A major application for this algorithm is encryption. With the Shor algorithm, a quantum computer would be able to crack codes much more quickly than a classical computer could. Public key cryptosystems, such as RSA, that rely on the difficulty of factoring very large numbers into their primes would become obsolete if the Shor algorithm were implemented on a practical quantum computer. In quantum cryptography, protocols using quantum key exchange could take advantage of the phenomenon of entanglement to ensure that only the sender and the intended recipient could read a message.
  • In 1996, Lov Grover developed a quantum algorithm that — theoretically — could dramatically speed up a database search. The number of steps needed in the Grover algorithm is proportional to the square root of the number of items in the database. Using this algorithm, a quantum computer could search an unsorted database to locate a specific entry much more efficiently than a conventional computer can.
  • Quantum computers could also be useful for simulations of quantum mechanical effects in physics, chemistry, biology and other fields.
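The square-root speedup can be seen in a minimal statevector sketch of Grover’s algorithm (the oracle and inversion-about-the-mean steps below are the textbook form; the specific sizes are illustrative). For N = 8 items, about π/4·√N ≈ 2 iterations concentrate nearly all the probability on the marked entry, versus up to 8 classical checks.

```python
from math import pi, sqrt

def grover(n, marked, iterations):
    """Statevector sketch of Grover search over N = 2**n unsorted items."""
    N = 2 ** n
    state = [1 / sqrt(N)] * N                 # equal superposition of all items
    for _ in range(iterations):
        state[marked] = -state[marked]        # oracle: flip the marked amplitude
        mean = sum(state) / N                 # diffusion: invert about the mean
        state = [2 * mean - a for a in state]
    return state

n, marked = 3, 5
iterations = round(pi / 4 * sqrt(2 ** n))     # ~ sqrt(N) steps; here 2
state = grover(n, marked, iterations)
prob = state[marked] ** 2                     # probability of reading out item 5
print(iterations, round(prob, 3))
```

Running more iterations than ~π/4·√N actually rotates the amplitude past the target and the success probability falls again — a well-known feature of the algorithm.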

 

Building a Quantum Computer

  • We still don’t know how to build a quantum computer. One huge problem is that the particles used for calculations must remain isolated from their surroundings. Interactions with outside particles entangle the qubits with their environment — a process known as decoherence — destroying their delicate quantum states and producing faulty results.
  • Quantum physicists are working on a number of methods for controlling qubits: atoms or charged ions in an electro-magnetic trap, nuclear magnetic resonance, superconductor microcircuits, quantum dots, atom conveyors, and cavity quantum electrodynamics, among others.
  • Quantum computing is still largely theoretical and there is no agreement on the best way to build a quantum computer. We don’t know what a quantum computer capable of complex computing will look like — or even if it is actually possible to build one. We have just begun the journey. We do know that, whatever computing device we are using by mid-century, it will be far more powerful than the computers we use today, just as today’s machines greatly overshadow their predecessors.

 


Appendix I: Diagram of a Quantum Computer

Source: USA Today, Beyond the PC: Atomic QC

Other Descriptions of Quantum Computing

George Johnson, A Shortcut Through Time: The Path To The Quantum Computer, Alfred A. Knopf: 2003.

“In the tiny spaces inside atoms, the ordinary rules of reality … no longer hold. Defying all common sense, a single particle can be in two places at the same time. And so, while a switch in a conventional computer can be either on or off, representing 1 or 0, a quantum switch can paradoxically be in both states at the same time, saying 1 and 0…. Therein lies the source of the power.”

Jakob Reichel, “Atom Chips,” Scientific American, February 2005: Vol. 292, Issue 2.

“Today’s star in the quantum scene is the quantum computer. This future device would exploit the superposition principle (another peculiar feature of the quantum world) to carry out certain types of computations much faster than any classical computer could do. A quantum computer functions by manipulating qubits, the quantum counterparts to bits. An ordinary, classical (non-quantum) logical bit can only be true or false, 1 or 0. The qubit, by contrast, can be in a superposition state corresponding to any mixture of true and false at the same time, like Schrödinger’s cat in its mixture of alive and dead.

In a classical computer, computations corresponding to different bit states must be carried out one after the other. With qubits, they are elegantly performed all at the same time. It has been proven that for certain problems this feature makes a quantum computer fundamentally faster than any classical computer can ever be.

The favorite occupation of quantum physicists these days is to think of practical ways to make a quantum computer: with trapped ions, with large molecules, with electron spins–or maybe with BECs on atom chips. The idea is tempting because such a quantum chip seems so attractively similar to a traditional microelectronics chip and at the same time so radically new. Components such as the atom conveyor could be used to bring qubits together to interact in a controllable fashion.

Thus, the condensate on a chip is the beginning of a story. As so often occurs in science, the plot of the story is not known in advance, and the actors themselves are discovering it in little steps. As in the past, surprises will crop up–pleasant and unpleasant ones. Some obstacles will be removed; others will force researchers to change directions. Whatever we find out will help to bring the classical and quantum worlds still closer together on the stage of science.”

Aaron Ricadela, “Quantum’s Leap,” InformationWeek, May 10, 2004


[1] The size of molecules ranges from about 0.1 nanometer for simple molecules up to about 50 nanometers for complicated biological macromolecules such as proteins and enzymes. In comparison, a human hair is 150,000 nm in diameter and represents the smallest feature an unaided human eye can see. A water molecule is about 0.3 nanometers in diameter.

[2] Qubits can exist in a superposition that is simultaneously both 1 and 0 or somewhere in between.