Computing: Two States of Being
Quantum computing is a growing interdisciplinary field that could drastically change computing as we know it. Current encryption schemes could be broken outright, compromising much of the world's digital security. Search problems could be solved substantially faster than any known classical algorithm allows, leading to more efficient research and information gathering. Particles and atoms could be simulated in ways that reveal the behaviour of the universe, where previously doing so was infeasible. Quantum computing has not yet reached these points, but the potential alone makes it worthwhile to investigate the fundamentals and origins of the field.
Comparing quantum computers to classical computers is difficult. Both are computers, but the science underlying how they work is vastly different. Classical computers depend on bits represented by an electrical voltage, each bit holding one of two states, one or zero. Quantum computers depend on the qubit, which can take on the same two states but lives in a quantum-mechanical system that allows it to be in a superposition of both states simultaneously. Qubits are what give quantum computing its remarkable potential. Unfortunately, a qubit's superposition is incredibly difficult to maintain. If a quantum system is exposed to too much environmental interference, the superposition is disturbed much as if it had been measured, forcing the system into one of the two states. This fundamental difference means that quantum computers are unlikely to replace our personal classical computers. They will likely be reserved for more specialised, complex workloads, and the field still has a great deal of research ahead of it.
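To make the idea of superposition and measurement a little more concrete, a single qubit can be described by two complex amplitudes, one for each of the states zero and one; measuring the qubit collapses it to zero or one with probabilities given by the squared magnitudes of those amplitudes. The short Python sketch below simulates this behaviour with NumPy. It is only an illustration of the mathematics, not how a physical qubit or any real quantum toolkit works, and the names used (ZERO, H, measure) are invented for this example.

```python
import numpy as np

# A qubit state is a 2-component complex vector (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1.  |alpha|^2 is the probability of measuring 0,
# |beta|^2 the probability of measuring 1.

ZERO = np.array([1, 0], dtype=complex)  # the definite state "0"

# Hadamard gate: sends "0" to an equal superposition of "0" and "1".
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

def measure(state, rng=np.random.default_rng()):
    """Collapse the state: return 0 or 1 with probabilities |alpha|^2, |beta|^2."""
    probabilities = np.abs(state) ** 2
    return int(rng.choice([0, 1], p=probabilities))

superposition = H @ ZERO                 # amplitudes (1/sqrt(2), 1/sqrt(2))
print([measure(superposition) for _ in range(10)])  # a random mix of 0s and 1s
```

Running the last line repeatedly gives roughly equal numbers of zeros and ones, which is the behaviour the prose above describes: before measurement the qubit carries both possibilities at once, and measurement (or enough environmental interference) forces a single classical outcome.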
Quantum computing is a relatively young idea, and quantum mechanics itself did not originate with computing in mind. The field of quantum mechanics grew out of attempts to solve problems that classical physics could not explain, and it began gaining attention with the study of black-body radiation around 1900. The earliest public suggestion of using quantum effects for computation came from Richard Feynman in 1959. The idea received little notice until the 1980s, when Feynman and Yuri Manin argued that quantum computers could simulate systems that would be impossible to simulate on a classical computer. Feynman put it this way: “Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly, it's a wonderful problem because it doesn't look so easy.” These ideas were picked up by the mathematician Peter Shor, who in 1994 developed a quantum factoring algorithm that could break most widely used public-key encryption. This result capped the 20th century by revealing the power behind this new form of computing and ignited interest in quantum computing research.
Since the beginning of the 21st century, quantum computing has made major strides. Many universities and companies have invested in the development of real quantum computers, and over the years the qubit count has steadily grown, with the current record being a 72-qubit chip developed by Google. Alongside these more complex machines, there has been a push to open quantum computing up to the commercial market. At the end of 2017, Microsoft released Q#, a programming language for quantum algorithms that ships with a quantum simulator to run it. In January 2019, IBM announced one of the first commercial quantum computers. These developments are not the end of the road, however. Quantum supremacy, the point at which a quantum computer performs a task that no classical computer could complete in any feasible amount of time, is currently still out of reach. For now, it seems that computing will exist in two states at once, both classical and quantum.