Quantum computing is a type of computation whose operations can harness the phenomena of quantum mechanics, such as superposition, interference, and entanglement. Devices that perform quantum computations are known as quantum computers. Though current quantum computers are too small to outperform ordinary (classical) computers for practical applications, larger realizations are believed to be capable of solving certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers. The study of quantum computing is a subfield of quantum information science.

There are several models of quantum computation, the most widely used being quantum circuits. Other models include the quantum Turing machine, quantum annealing, and adiabatic quantum computation. Most models are based on the quantum bit, or “qubit”, which is somewhat analogous to the bit in classical computation. A qubit can be in a 1 or 0 quantum state, or in a superposition of the 1 and 0 states. When it is measured, however, the result is always either 0 or 1; the probability of each outcome depends on the qubit’s quantum state immediately prior to measurement.
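As a concrete illustration of superposition and measurement, the following sketch (in Python with NumPy; illustrative, not part of the original text) represents a qubit as a unit vector of complex amplitudes and applies the Born rule, under which the probability of each measurement outcome is the squared magnitude of its amplitude:

```python
import numpy as np

# A qubit state is a unit vector in C^2. The basis states |0> and |1>:
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of |0> and |1> (each amplitude is 1/sqrt(2)):
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: probability of outcome k is |amplitude_k|^2.
p0 = abs(psi[0]) ** 2
p1 = abs(psi[1]) ** 2
print(p0, p1)  # both are approximately 0.5

# Simulate repeated measurements: each run yields 0 or 1, never "both".
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=[p0, p1])
```

Each individual measurement collapses the superposition to a definite 0 or 1; only the statistics over many runs reveal the underlying amplitudes.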

Efforts towards building a physical quantum computer focus on technologies such as transmons, ion traps, and topological quantum computers, which aim to create high-quality qubits. There are currently a number of significant obstacles to constructing useful quantum computers. It is particularly difficult to maintain qubits’ quantum states, as they suffer from quantum decoherence, which degrades state fidelity. Quantum computers therefore require error correction.

Any problem that can be solved by a quantum computer can also be solved by a classical computer, at least in principle given enough time; in other words, quantum computers obey the Church–Turing thesis. This means that while quantum computers provide no advantage over classical computers in terms of computability, quantum algorithms for certain problems have significantly lower time complexities than the best known classical algorithms.

**History**

Quantum computing began in 1980 when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things a classical computer could not feasibly do. In 1986 Feynman introduced an early version of the quantum circuit notation. In 1994, Peter Shor developed a quantum algorithm for finding the prime factors of an integer with the potential to decrypt RSA-encrypted communications. In 1998 Isaac Chuang, Neil Gershenfeld, and Mark Kubinec created the first two-qubit quantum computer that could perform computations. Despite ongoing experimental progress since the late 1990s, most researchers believe that “fault-tolerant quantum computing [is] still a rather distant dream.” In recent years, investment in quantum computing research has increased in the public and private sectors. On 23 October 2019, Google AI, in partnership with the U.S. National Aeronautics and Space Administration (NASA), claimed to have performed a quantum computation that was infeasible on any classical computer.

A December 2021 McKinsey & Company analysis states that “…investment dollars are pouring in, and quantum-computing start-ups are proliferating”.

**Quantum circuit**

**Definition**

The prevailing model of quantum computation describes a computation as a network of quantum logic gates. This model is a complex linear-algebraic generalization of Boolean circuits.
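To make the linear-algebraic picture concrete, here is a minimal sketch (in Python with NumPy; the gate names are standard, but the example itself is illustrative and not from the original text) in which gates are unitary matrices and running a two-qubit circuit is just a matrix product applied to a state vector:

```python
import numpy as np

# Single-qubit gates are 2x2 unitary matrices; n-qubit states live in C^(2^n).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
I = np.eye(2, dtype=complex)

# CNOT acts on two qubits: it flips the target iff the control qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# A standard entangling circuit: H on qubit 0, then CNOT, starting from |00>.
ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0
state = CNOT @ np.kron(H, I) @ ket00
# state is now the Bell state (|00> + |11>)/sqrt(2):
print(np.round(state.real, 3))
```

The tensor product (`np.kron`) lifts a single-qubit gate to the two-qubit space, and composing gates in sequence corresponds to multiplying their matrices; this is the sense in which the circuit model generalizes Boolean circuits with linear algebra over the complex numbers.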

**Quantum algorithms**

Progress in finding quantum algorithms typically focuses on this quantum circuit model, though exceptions such as the quantum adiabatic algorithm exist.
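As a small worked example of a quantum algorithm in the circuit model, the sketch below (in Python with NumPy; illustrative, not from the original text) simulates Deutsch's algorithm, which uses interference to decide with a single oracle query whether a one-bit function is constant or balanced:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

def oracle(f):
    """Build the unitary U_f with U_f|x, y> = |x, y XOR f(x)>."""
    U = np.zeros((4, 4), dtype=complex)
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    return U

def deutsch(f):
    """Classify f as 'constant' or 'balanced' using one oracle query."""
    state = np.kron([1, 0], [0, 1]).astype(complex)  # start in |0>|1>
    state = np.kron(H, H) @ state                    # create superposition
    state = oracle(f) @ state                        # single query to f
    state = np.kron(H, I) @ state                    # interfere the amplitudes
    # Probability that the first qubit measures as 1 (indices 2 and 3):
    p1 = abs(state[2]) ** 2 + abs(state[3]) ** 2
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))  # a constant function
print(deutsch(lambda x: x))  # a balanced function
```

Classically, distinguishing these two cases requires evaluating f on both inputs; the quantum circuit extracts a global property of f from one query by encoding f in phases and interfering them.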