Photo credit: https://www.newscientist.com/article/mg23130894-000-revealed-googles-plan-for-quantum-computer-supremacy/
When most people hear the word ‘quantum’ they picture a frizzy, gray-haired professor spouting physics terms while students desperately try to keep up. The term itself, however, is quite simple: it comes from the Latin for “quantity” and refers to the smallest discrete amount of something. In physics it describes microscopic particles whose behavior differs from classical physics, which is based largely on Newton’s laws. These tiny packets have strange, counterintuitive properties that leave us questioning the nature of reality, and several of those properties, including quantum entanglement, may hold a great opportunity for computers (Nielsen & Chuang, 2000).
Normally, a classical computer (the kind in use today) describes and processes data using bits: a ‘1’ indicates on and a ‘0’ indicates off. All of the mathematics a classical computer performs is built on this fundamental idea. While it has served well all the way from giant tape machines to modern microprocessors, we may be approaching a limit on how small transistors can get (Kumar, 2015). One reason is that electrons, the tiny negatively charged particles whose pulses carry information through a processor, stop following their intended paths below a certain size and begin to leak across barriers. This raises a competitive problem for major chip makers such as Intel and AMD: if there is a hard limit, something drastic must be done.
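To make the classical side concrete, here is a minimal Python sketch (plain Python, no special libraries; the register contents are just an illustration) showing that an n-bit register holds exactly one of its 2^n possible values at any moment:

```python
# A classical register of n bits is in exactly one definite state at a time.
n = 3
register = [1, 0, 1]  # each bit is definitively on (1) or off (0)

# Interpret the bits as a single number, as a classical processor would.
value = 0
for bit in register:
    value = (value << 1) | bit

print(f"{n} bits, {2**n} possible states, current state: {value}")  # -> 5
```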
A quantum computer instead uses a qubit, a quantum bit that, through properties such as spin and superposition, can effectively hold a blend of ‘1’ and ‘0’ at the same time until it is measured. This inherent structural difference creates more possibilities for problem solving and simpler algorithm design, since a quantum computer can skip many of the steps a classical machine would have to grind through one at a time.
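As a rough sketch of what that means, a single qubit can be written as two complex amplitudes, and a gate such as the Hadamard turns a definite ‘0’ into an equal superposition. The NumPy example below is only an illustration of the state-vector picture, not any particular hardware or library API:

```python
import numpy as np

# A single qubit is described by two complex amplitudes (alpha, beta) with
# |alpha|**2 + |beta|**2 == 1; measuring it yields 0 or 1 with those probabilities.
ket_zero = np.array([1, 0], dtype=complex)   # the |0> state

# The Hadamard gate puts a definite |0> into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket_zero
probabilities = np.abs(state) ** 2

print(probabilities)  # -> [0.5 0.5]: equal chance of measuring 0 or 1
```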
While experimental devices have already been built on this idea (Fitzsimons, Rieffel, & Scarani, 2012), a commercial disruption of the current technology industry would be incredibly beneficial: the new paradigm has the potential to solve certain problems exponentially faster. Several obstacles to implementation have been discussed, such as ‘decoherence’, the unwanted interaction between a quantum system and its environment that destroys its fragile quantum state (Preskill, 2012). This new form of computing would also have a drastic impact on the world of simulation, since classical computers are believed to be incapable of efficiently simulating quantum systems.
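Decoherence can be pictured with a toy model. The sketch below (NumPy again, with a simple phase-flip noise channel; the error probability and step count are illustrative, not taken from the cited papers) shows the off-diagonal terms of a qubit’s density matrix, which carry the quantum interference, decaying toward zero until only an ordinary coin flip remains:

```python
import numpy as np

# Start from the equal superposition |+>, then apply a phase-flip (Z) error
# with probability p at each step: rho -> (1-p)*rho + p*Z*rho*Z.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())            # density matrix of |+>
Z = np.diag([1, -1]).astype(complex)

p = 0.1                                      # illustrative error probability per step
for step in range(20):
    rho = (1 - p) * rho + p * (Z @ rho @ Z.conj().T)

print(np.round(rho.real, 3))
# The off-diagonal entries shrink from 0.5 toward 0 as coherence is lost.
```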
Giant steps forward in technology have recently been seen in virtual reality and self-driving cars, but both are based on classical computing. A fundamental paradigm shift in information processing could have massive impacts on commercial, military, and security applications, as well as on the everyday person’s use of software, the Internet, and more.
If these computers are better, why haven’t they hit the market yet? Beyond the limitations still being researched, it is likely more profitable for companies to stick with the status quo of transistor-based architecture rather than spend money on an experimental idea. However, the shift seems inevitable if the limits of Moore’s law hold true, and the breakthrough will be a great reward for the company that takes the first bold step.
References
Nielsen, M. A., & Chuang, I. L. (2000). Quantum Computation and Quantum Information. Cambridge University Press. Retrieved February 11, 2018
Fitzsimons, J. F., Rieffel, E. G., & Scarani, V. (2012). The Quantum Frontier. arXiv: Quantum Physics. Retrieved February 11, 2018, from https://arxiv.org/abs/1206.0785
Kumar, S. (2015). Fundamental Limits to Moore's Law. arXiv: Mesoscale and Nanoscale Physics. Retrieved February 11, 2018, from https://arxiv.org/abs/1511.05956
Preskill, J. (2012). Quantum computing and the entanglement frontier. arXiv: Quantum Physics. Retrieved February 11, 2018, from https://arxiv.org/abs/1203.5813