Quantum Computing: Where This New Technology Is Headed This Decade, Part 1

General Overview

Forty years ago, Gordon Moore of Intel Corporation observed that the number of transistors on a silicon chip was doubling every year. In what became known as Moore’s Law, he predicted that the number of transistors on a chip would continue to double at regular intervals for the foreseeable future.

Transistors are tiny on/off switches that conduct the electrical signals represented as “1”s and “0”s in computers. Engineers have devised techniques to shrink transistors to ever smaller sizes, steadily increasing the performance and capabilities of microprocessors and memory devices. At some point, however, an individual transistor will become so tiny that the quantum nature of electrons will cause computation errors, and transistors will no longer be reliable conductors of the electrical signal.

One possibility for overcoming this limit is to create quantum computers that can take advantage of the quantum nature of molecules and smaller particles to perform the processing tasks of a traditional computer. Quantum “processors” could possibly one day replace silicon chips, just as the transistor replaced the vacuum tube.

Quantum mechanics indicates that the world inside an atom looks very different from the world we observe around us. The seemingly impossible is possible. One quantum phenomenon is that a subatomic particle can be in two places at the same time. So, whereas a transistor is either “on” or “off”, a device made of these particles can be “on” and “off” simultaneously. Scientists are using this behavior to develop quantum computers that they hope one day will be capable of processing vast amounts of information at an extraordinary speed.
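
The “on and off at the same time” behavior can be sketched numerically. Below is a minimal illustration in plain Python, assuming a single qubit represented as a two-element state vector; this is a toy simulation, not any particular quantum programming toolkit:

```python
import random

# A qubit in equal superposition of |0> ("off") and |1> ("on"):
# the squared magnitudes of the amplitudes give measurement probabilities.
amp0 = 2 ** -0.5   # amplitude of |0>
amp1 = 2 ** -0.5   # amplitude of |1>

p0 = amp0 ** 2     # probability of reading "0" when measured (~0.5)
p1 = amp1 ** 2     # probability of reading "1" when measured (~0.5)

def measure():
    """Collapse the superposition: return 0 or 1 with the probabilities above."""
    return 0 if random.random() < p0 else 1

# Until measured, the qubit carries both possibilities at once;
# each measurement yields a single classical bit.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1
print(counts)   # roughly 5,000 of each
```

The key point the sketch captures is that the superposition holds both values simultaneously, but any readout forces a single classical answer.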

Early interest in quantum computers focused on solving problems that seem intractable with traditional models of computation. A number of discoveries have been made indicating that quantum computers could have some very practical applications. Shor’s algorithm, devised in 1994, can factor very large numbers when run on a quantum computer – numbers on the order of 10^200. This has important implications for the field of cryptography, which today is based on the difficulty of factoring very large numbers into their primes. Grover’s algorithm, developed in 1996, could speed up searches of an unsorted database. Quantum computers could also be useful for simulations of quantum mechanical effects in physics, chemistry, biology and other fields.
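
The scale of the Grover speedup is easy to illustrate: searching an unsorted list of N items classically takes on the order of N lookups, while Grover’s algorithm needs on the order of the square root of N. A rough comparison in plain Python (the constant factors are idealized for illustration):

```python
import math

for n in (1_000, 1_000_000, 1_000_000_000):
    classical = n // 2       # average lookups for an unsorted classical search
    grover = math.isqrt(n)   # order of Grover iterations, ~sqrt(N)
    print(f"N={n:>13,}  classical ~{classical:>11,}  quantum ~{grover:,}")
```

For a billion items, the classical search averages about 500 million lookups, while the square-root scaling brings the quantum figure to roughly 31,622 iterations.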

Quantum computing is still largely theoretical and there is no agreement on the best way to build a quantum computer. We don’t know what a quantum computer capable of complex computing will look like or even if it is actually possible to build one — we have just begun the journey. We do know that whatever computing device we use will be far more powerful than the computers we use today by mid-century, just as today’s machines greatly overshadow their predecessors.

Moore’s Law

● In the mid-60s, Gordon Moore observed that the number of devices per square inch on integrated circuits had doubled every year since the integrated circuit was invented, and he predicted that this trend would continue for the foreseeable future. He recently stated he expects we have another 20 years or so before the law breaks down.

● In subsequent years, the pace slowed down a bit, but data density has doubled approximately every 18 months — the current definition of Moore’s Law.

● The original statement of Moore’s Law was presented in Electronics magazine in 1965. His observation was graphed as the logarithm of the number of components (transistors and resistors) on a chip over time, and it showed a doubling every year. Moore presented a revised law at the 1975 IEEE International Electron Devices Meeting, plotting the logarithm of the number of transistors on a chip over time and slowing the doubling cycle to two years. In the late 1980s, Moore’s Law was revised again to an 18-month cycle.

Moore’s Law at the Limit

Today’s chips

● In 1965, state-of-the-art chips contained about 60 distinct devices — transistors and resistors at the time. Intel’s latest Itanium chip has 1.7 billion transistors. Currently, chips are made using a 90-nanometer process technology.

● Transistors are electronic switches made from materials that conduct electricity when energized. On one side of the switch is the source (the emitter, in older terminology) and on the other side is the drain, the output; in between is the transistor gate. When voltage is applied to the gate, the transistor becomes conductive and current flows from the source to the drain. When there is no voltage at the gate, the transistor is non-conductive.
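
The gate-controlled conduction described above is what lets transistors implement digital logic. A toy model in Python, treating the transistor as an ideal voltage-controlled switch (real devices are analog; this is purely illustrative):

```python
def transistor(gate_on: bool, source_signal: bool) -> bool:
    """Pass the source signal through to the drain only when the gate is energized."""
    return source_signal if gate_on else False

# Gate energized: current flows, and the drain mirrors the source.
print(transistor(gate_on=True, source_signal=True))    # True  -> a "1"
# No gate voltage: the switch is open and the drain reads "0" regardless.
print(transistor(gate_on=False, source_signal=True))   # False -> a "0"
```

Billions of such switches, wired into logic gates, are what turn patterns of “1”s and “0”s into computation.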

The problem

● At some point, silicon transistors will be so small that the source and drain will be so close together that spontaneous quantum tunneling of electrons between them becomes likely. According to the Heisenberg uncertainty principle, it is impossible to predict the specific location of an individual electron, so there will be no way of knowing whether an individual switch is on or off. Transistors will then lose their reliability, because it will be impossible to control the flow of electrons, and hence the creation of the “1”s and “0”s needed for computing.

● In other words, at some point the size of each transistor will become so small that the quantum nature of electrons will result in computation errors.

● Experts have differing opinions as to when the ability to shrink transistors will become problematic. Intel says that around 2015, manufacturers will start to move toward hybrid chips, which combine elements of traditional transistors with newer technologies such as nanowires. A full conversion to new types of chips may not occur until the 2020s.

● Theoretically, silicon transistors can be shrunk until about the four-nanometer process generation. It’s at that point that the source and the drain will be so close that electrons will be able to tunnel across on their own, whether the gate is “open” or “closed.”
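
The leakage mechanism behind this limit is quantum tunneling, whose probability grows exponentially as the barrier between source and drain thins. A rough sketch of that scaling — the decay constant here is an assumed, illustrative value, not a measured one:

```python
import math

def tunneling_probability(barrier_nm, kappa_per_nm=5.0):
    """Approximate tunneling transmission ~ exp(-2 * kappa * width).

    kappa_per_nm is an assumed decay constant; real values depend on the
    barrier height and the electron's effective mass in the material.
    """
    return math.exp(-2 * kappa_per_nm * barrier_nm)

for width in (20, 10, 4):   # channel lengths in nanometers
    print(f"{width} nm barrier -> relative tunneling ~ {tunneling_probability(width):.1e}")
```

Because the dependence is exponential, each halving of the barrier width squares the transmission probability, which is why leakage that is utterly negligible at older process generations grows rapidly as dimensions approach a few nanometers.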

Why it Matters

● The proliferation of data generated by business processes, scientific research and governmental mandates, combined with the spread of personal computers, handheld and wireless devices, and the internet, has led to demand for ever-growing speed and computing capacity.

● Improvements in chip technologies have enabled the industry to develop ever more capable electronic equipment. Among other improvements, chip vendors have been able to maintain a significant rate of miniaturization.

● At some point, manufacturers will need to adopt alternative technologies to the transistor, including carbon nanotubes, silicon nanowire transistors, molecular crossbar switches, phase change materials and spintronics.

My next chapter discusses what is next for quantum computing, and how it can help us extract value from the massive data sets we will need to deal with over the next decade and beyond.

Dr. Eslambolchi