Quantum computing as a theory has been around since the early 1980s; yet, decades later, it still reads more like science fiction than engineering. Several movies in the sci-fi thriller genre have revolved around quantum computers.

Paul Benioff proposed the first recognisable theoretical framework for a quantum computer in 1980. Prof Richard Feynman delivered his influential lecture 'Simulating Physics with Computers' in 1981 (published in 1982). In the last two decades, quantum computing has moved out of the realm of wild imagination to become one of the hottest topics in both quantum physics and computer science.
So, what is quantum computing? 
Quantum computing is a technology based on the principles of quantum theory: it harnesses the laws of quantum mechanics to carry out certain complex computations. Quantum mechanics describes the realm of subatomic particles, where the laws of classical physics break down. It shows that particles and waves have a dual nature: particles like electrons can behave like waves, while light waves also display particle-like behaviour. The basic unit of quantum information is the qubit (quantum bit), the quantum counterpart of the classical binary bit. Unlike a classical bit, a qubit can exist in a superposition of 0 and 1, and a quantum processor manipulates many qubits at once so that the amplitudes of different answers interfere and the right ones are reinforced; it does not simply try every combination in parallel, and today's processors hold hundreds to a few thousand qubits, not millions. Quantum entanglement, a deep correlation between quantum particles, means that measurements on entangled qubits remain perfectly correlated even when the qubits are miles (or millions of miles) apart, although entanglement alone cannot be used to send signals.
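The ideas above can be sketched in a few lines of plain Python as a toy state-vector simulation (an illustration only, not a real quantum device; the state representation and function names here are the author's own):

```python
import math

s = 1 / math.sqrt(2)

# A single qubit is a pair of amplitudes (amp_0, amp_1) with |amp_0|^2 + |amp_1|^2 = 1.
ket0 = (1.0, 0.0)  # the state |0>, analogous to a classical bit set to 0

def hadamard(state):
    """Hadamard gate: sends |0> into an equal superposition of |0> and |1>."""
    a, b = state
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Born rule: each measurement outcome occurs with probability |amplitude|^2."""
    return tuple(abs(amp) ** 2 for amp in state)

plus = hadamard(ket0)
print(probabilities(plus))  # an even 50/50 chance of reading 0 or 1

# Two entangled qubits: four amplitudes over the outcomes (00, 01, 10, 11).
bell = (s, 0.0, 0.0, s)  # the Bell state (|00> + |11>) / sqrt(2)
print(probabilities(bell))  # only 00 and 11 ever occur: perfectly correlated readings
```

The Bell state at the end captures entanglement in miniature: whatever the first qubit's measurement gives, the second is guaranteed to match, yet neither outcome can be chosen in advance, which is why no signal can be sent this way.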