Quantum computing as a theory has been around since 1981; however, it still reads like science fiction more than three decades later. Several movies in the sci-fi thriller genre have revolved around quantum computers and super-intelligent machines.

  • Sneakers (1992) foretold a world in which a computer could decode all the computerised classified data in the world. (Two years after the movie came out, mathematician Peter Shor figured out the math that makes cryptography vulnerable to quantum computers.)
  • Minority Report (2002) creatively portrayed, and in some cases accurately predicted, future technologies.
  • I, Robot (2004) and Eagle Eye (2008) showed computers with near-human feelings and an ability to hack and control everything.
  • The Matrix trilogy (1999-2003) and Transcendence (2014) imagined sentient computers.

Paul Benioff proposed the first recognisable theoretical framework of quantum computing in 1982. Prof Richard Feynman lectured on ‘Simulating Physics with Computers’, also published in 1982. Over the last two decades, quantum computing has moved out of the realm of wild imagination to become one of the hottest topics in both quantum physics and computer science.

So, what is quantum computing?

Quantum computing is a technology based on the principles of quantum theory. It harnesses the laws of quantum mechanics to carry out complex data operations. Quantum mechanics pertains to the realm of sub-atomic particles, where the laws of classical physics break down. It shows how particles and waves have a dual nature: particles like electrons tend to behave like waves, whereas light waves also display particle-like behaviour. A qubit (or quantum bit) is the basic unit of quantum information, the quantum counterpart of the classical binary bit. Unlike a classical bit, which is either 0 or 1, a qubit can exist in a superposition of both states at once, which lets a quantum processor represent and work through many combinations of states simultaneously. Quantum entanglement means that the states of entangled qubits remain perfectly correlated with each other, even when the qubits are miles (or even millions of miles) apart.
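The ideas above can be made concrete with a few lines of linear algebra. The following is a minimal NumPy sketch (not from the article): it puts one qubit into an equal superposition with a Hadamard gate, then builds a two-qubit entangled Bell state whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# Basis states for a single qubit: |0> and |1>
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

plus = H @ ket0               # (|0> + |1>) / sqrt(2)
probs = np.abs(plus) ** 2     # Born rule: probability of measuring 0 or 1
print(probs)                  # [0.5 0.5] -- each outcome equally likely

# Two-qubit Bell state: Hadamard on the first qubit, then a CNOT gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = np.kron(ket0, ket0)             # start in |00>
state = np.kron(H, np.eye(2)) @ state   # superpose the first qubit
bell = CNOT @ state                     # (|00> + |11>) / sqrt(2)

# Entanglement: only |00> and |11> ever occur, so the two qubits'
# measurement results are perfectly correlated.
print(np.abs(bell) ** 2)                # [0.5 0.  0.  0.5]
```

The key point the sketch illustrates is that entanglement is a correlation between measurement outcomes, not a signalling channel: measuring one qubit tells you what the other will show, however far apart they are.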
