For most of our history, human technology consisted of little more than the brain, fire, and sharp sticks. While fire and sharp sticks eventually turned into power plants and nuclear weapons, the biggest changes have come to the thinking we hand off to machines. Since the 1950s, computing power has been growing exponentially, allowing our computers to become smaller and more powerful at the same time. However, this process is approaching its limit, because the individual parts of a computer are approaching the size of an atom. To understand why, we first have to know how a computer works.
The Basics of a Computer
Computers are built from tiny electrical switches called transistors, which are combined into circuits known as logic gates. Each gate either holds electric charge or releases it, and the flow of electricity through these gates creates signals that can be detected. This is what computers use to transfer, process, and store data. The whole system is controlled by just two numbers: 1 when a gate is holding charge and 0 when it is not. This is called the binary system, because it deals with only two values, 0 and 1, and each of those values is called a bit. A single logic gate seems simple, and you can't do much with it on its own, but billions of gates storing 1s and 0s add up to a computer that can solve complex problems. So to build a more powerful computer, we need more logic gates. That is exactly what today's computers, or classical computers, are doing: shrinking each component and fitting more of them on a CPU. However, as previously noted, classical computers are running into a problem, because those components are approaching the size of an atom.
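To make the idea concrete, here is a toy sketch in Python. It is not how real hardware is wired, just the same logic expressed in code: a few simple gates acting on bits are already enough to do arithmetic.

```python
# Toy model of logic gates operating on bits (0 or 1).
# Real hardware builds these from transistors, but the logic is the same.

def AND(a, b):  return a & b
def OR(a, b):   return a | b
def XOR(a, b):  return a ^ b
def NOT(a):     return 1 - a

def half_adder(a, b):
    """Add two bits: returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

# Adding 1 + 1 in binary gives sum 0 with a carry of 1, i.e. binary 10 (decimal 2).
print(half_adder(1, 1))  # -> (0, 1)
```

Chaining millions of such gates together is, in essence, what a CPU does.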
What is the Problem?
As we approach the quantum level, particles behave differently from the way they do at the macro level. Electrons that stay put at macroscopic scales can no longer be held inside a logic gate once the gate is only a few atoms wide; an electron can simply appear on the other side of a blocked passage. You might wonder how this happens, since electrons can't just teleport, right? Well, the truth is that, in a sense, they can, and this process is called quantum tunneling. In quantum physics, everything is probabilistic, meaning a particle has some chance of being found in many different places. In the case of an electron, there is a small but real probability of finding it beyond the barrier that is supposed to contain it, so at the quantum level an electron can sometimes slip through the potential barrier of a logic gate. This is tied to the Heisenberg uncertainty principle, which states that we can never know both the exact position and the exact momentum of a particle at the same time. Is there a solution to this problem? Well, yes. Scientists are now developing quantum computers, computers that operate at the quantum level. So, what is quantum computing, and how does it work?
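To see why shrinking makes tunneling so much worse, here is a rough back-of-the-envelope sketch in Python using the standard textbook approximation for tunneling through a rectangular barrier. The 1 eV barrier height is an illustrative number, not a figure from any real chip; the point is only how fast the probability climbs as the barrier gets thinner.

```python
import math

# Rough estimate of how likely an electron is to tunnel through a thin barrier,
# using the standard approximation T ~ exp(-2 * kappa * L) for a rectangular barrier.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837e-31      # electron mass, kg
eV = 1.602176634e-19     # one electronvolt in joules

barrier_height = 1.0 * eV                       # illustrative barrier, not a real chip spec
kappa = math.sqrt(2 * m_e * barrier_height) / hbar

for width_nm in (5.0, 2.0, 1.0):                # barrier width in nanometres
    L = width_nm * 1e-9
    T = math.exp(-2 * kappa * L)
    print(f"{width_nm:>3} nm barrier -> tunneling probability ~ {T:.1e}")
```

Running this shows the probability jumping by many orders of magnitude as the barrier shrinks from 5 nm to 1 nm, which is exactly the trouble transistors run into at atomic scales.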
The Future?
Quantum computing is a new kind of computing that’s been getting a lot of attention. It’s built on the laws of quantum mechanics, the same science that explains how atoms and subatomic particles behave.
We briefly mentioned bits before: the two numbers 0 and 1 that represent the flow of electricity inside logic gates, with 1 meaning a gate is holding charge and 0 meaning it is releasing it. Quantum computers are different; they use qubits instead of bits. A qubit can be a 0, a 1, or both 0 and 1 at the same time, following its own set of rules. The quantum property that allows a qubit to be both 0 and 1 is called superposition. The qubit's value is genuinely undetermined, because at the quantum level everything is probabilistic. Think of a spinning coin: while it spins it is neither heads nor tails, and only when we stop it and observe it does it settle on one side. That sounds weird, but it's a real physical phenomenon, and it's what gives quantum computers their potential power. There's also another important concept: entanglement. When two qubits are "entangled," their states are linked. Nobody fully understands why nature allows this, but it is a central property of quantum physics. Measuring one instantly tells you about the other, even if they're far apart. This lets quantum computers perform operations on interconnected states, something classical computers can't do. With enough qubits, a quantum computer can work with an enormous number of possible states at once.
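A rough way to picture this is to simulate the math on a classical machine. The sketch below uses plain NumPy (not a real quantum programming toolkit) to show a single qubit in superposition and a pair of entangled qubits, written as vectors of probability amplitudes:

```python
import numpy as np

# A qubit is written as a 2-element vector of "amplitudes":
# the squared magnitude of each amplitude is the probability of measuring 0 or 1.
zero = np.array([1, 0], dtype=complex)   # definitely 0
one  = np.array([0, 1], dtype=complex)   # definitely 1

# Superposition: equal parts 0 and 1 (the spinning-coin state).
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)                 # -> [0.5, 0.5]: a 50/50 chance of 0 or 1

# Entanglement: a two-qubit "Bell state" that cannot be split into two
# independent one-qubit states. Measuring one qubit fixes the other.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)                 # -> [0.5, 0, 0, 0.5]: outcomes 00 or 11 only
```

Note how the entangled pair never produces 01 or 10: the two qubits' results are perfectly correlated, which is what "their states are linked" means in practice.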
Nevertheless, this doesn't mean quantum computers are better across the board. They're just different. Classical computers today are incredibly fast and reliable, and they are great at handling tasks we already know how to do efficiently. On the other hand, when faced with problems like simulating the behavior of molecules at the atomic level, or breaking certain kinds of encryption, classical computers struggle. These are the kinds of problems quantum computers are being built to handle. For example, quantum computers are expected to be able to break RSA encryption, one of the most commonly used ways of protecting information online. RSA works by generating a public key and a private key: the public key lets people send you encrypted information, while your private key lets you decode it. This is effective because it would take classical computers millions of years to decode the information without knowing the private key. With a large enough quantum computer, however, RSA could be broken using an algorithm called Shor's algorithm. This is worrying, because someone could then use a quantum computer to break into bank accounts or other private information.
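Here is a deliberately tiny, illustrative sketch of the RSA idea in Python, with numbers far too small to be secure. It shows why the scheme stands or falls on how hard it is to factor the public modulus, which is exactly the task Shor's algorithm would make easy:

```python
# Toy RSA with tiny primes, just to show the public/private key idea.
# Real RSA uses primes hundreds of digits long; these numbers are purely illustrative.

p, q = 61, 53                      # two secret primes
n = p * q                          # public modulus (3233)
phi = (p - 1) * (q - 1)            # used to derive the private key
e = 17                             # public exponent (part of the public key)
d = pow(e, -1, phi)                # private exponent (the private key)

message = 65                                   # a message encoded as a number
ciphertext = pow(message, e, n)                # anyone can encrypt with (e, n)
decrypted  = pow(ciphertext, d, n)             # only the private key holder can decrypt

print(ciphertext, decrypted)                   # -> 2790 65

# Security rests on how hard it is to recover p and q from n alone.
# A quantum computer running Shor's algorithm could factor n quickly,
# which is why RSA is considered vulnerable to quantum attack.
```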
Fortunately, building a functional quantum computer isn't easy. Qubits are very delicate: they lose their quantum state quickly if there's even a tiny bit of noise or disturbance. This is known as decoherence, a third key property of qubits at the quantum level, in which a qubit loses its superposition, its entanglement, or sometimes both. That's why most quantum computers have to be kept extremely cold; the low temperature helps keep the qubits stable long enough to get anything done.
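To get a feel for the timescales, here is a small illustrative calculation assuming a qubit's coherence simply decays exponentially. The 100-microsecond coherence time used below is a made-up round number, not the spec of any real machine:

```python
import math

# Illustrative only: assume coherence decays roughly as exp(-t / T2),
# where T2 is the qubit's "coherence time".
T2 = 100e-6  # seconds (a made-up round number)

for t_us in (1, 10, 100, 500):
    t = t_us * 1e-6
    remaining = math.exp(-t / T2)
    print(f"after {t_us:>3} microseconds, ~{remaining:.0%} of the coherence is left")
```

Under this assumption, almost nothing is left after a few hundred microseconds, which is why a quantum computation has to finish, or be protected by error correction, before the qubits fall apart.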
Still, over the last few years, a few big companies and labs have managed to get their quantum computers up and running, each in their own way.
Progress and Methods That People Use to Approach Quantum Computing
The most widely used method today is superconducting qubits. These are electrical circuits made from materials that conduct electricity without resistance when cooled to temperatures close to absolute zero (around -273 °C, or -459 °F, colder than outer space). International Business Machines Corporation (IBM), Google, Amazon (which offers access to hardware from Rigetti, a company known for its quantum computing research), and researchers in China have all built or run machines using this method. Google made headlines with its Sycamore processor, and IBM is scaling up quickly; their Condor chip has over 1,100 qubits.

Another approach uses trapped ions: individual atoms held in place with electromagnetic fields. These ions are manipulated with lasers to perform computations. Companies like IonQ, Quantinuum, and a few university-affiliated startups are leading the way here. Trapped-ion systems are known for their precision and stability, though they can be harder to scale.
There are also photonic quantum computers, which use particles of light as qubits. Photons don't interact with their environment as much, so they're less likely to lose their quantum state. Xanadu (in Canada), PsiQuantum (in the U.S.), and research labs in Japan are pushing this technology. China's Jiuzhang project has also demonstrated some impressive results using photonics. Jiuzhang refers to a series of photonic quantum computers developed by a team at the University of Science and Technology of China (USTC), led by Pan Jianwei; the project is named after the ancient Chinese mathematical text Jiuzhang Suanshu (The Nine Chapters on the Mathematical Art).
Another growing area is neutral atom quantum computing, which traps neutral atoms using tightly focused lasers (called optical tweezers) and entangles them through interactions between highly excited atomic states. This approach offers a lot of flexibility in how qubits are arranged and is well-suited for quantum simulation tasks. Companies like QuEra and Pasqal are exploring this method.
Topological qubits are still experimental but very promising. Microsoft is betting on this idea, which involves exotic particles called Majorana fermions. Fermions are subatomic particles that follow Fermi-Dirac statistics and have half-integer spin. Majorana fermions are hypothetical particles that, unlike typical fermions, are their own antiparticles. This unusual property has drawn interest in their potential use in quantum computing, particularly for building stable qubits, because information encoded in them would be stored non-locally and therefore better shielded from local noise. If it works, it could lead to quantum computers that are much more resistant to errors. But for now, it's still in the research stage.
One more approach worth mentioning is quantum annealing, used by D-Wave (another company that researches quantum computers). The goal of quantum annealing is to find the lowest energy state of a system, and many optimization problems can be rewritten so that their best answer is exactly that lowest energy state. It's a different kind of quantum computer that isn't universal like the others, but it's good at solving optimization problems, like figuring out the best delivery routes or minimizing energy use in a network. D-Wave's machines already have thousands of qubits, but they work differently from the gate-based models most companies are building.
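As a concrete (and entirely classical) illustration of what "finding the lowest energy state" means, here is a tiny made-up optimization problem written as an energy function over three bits and solved by brute force. An annealer tackles much larger versions of the same kind of problem, where checking every option is no longer feasible:

```python
from itertools import product

# A tiny made-up optimization problem written as an "energy" over three bits.
# An annealer (quantum or classical) looks for the bit assignment with the
# lowest energy; here the problem is small enough to simply check every option.

def energy(x):
    a, b, c = x
    # Made-up costs: reward a and b disagreeing, reward c being on,
    # penalize all three being on at once.
    return 2 * a * b - a - b - 3 * c + 2 * a * b * c

best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))   # -> a lowest-energy assignment and its energy
```

Delivery routes, chip layouts, and scheduling problems can all be encoded as energy functions of this kind, which is why "find the lowest energy state" is a useful thing for a machine to be good at.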
Each of these technologies has trade-offs: some are easier to scale, some are more precise, and some are more resistant to noise. This is why there is no clear winner yet. Everyone is racing to figure out what works best, or perhaps what combination of technologies will give us the most useful machines.
In the near future, we'll likely see hybrid systems of classical and quantum computers, where classical computers do most of the work but pass the toughest parts to quantum processors. This combination could be a game-changer in fields like cryptography, drug discovery, finance, and materials science, places where solving just one tough problem could save years of work or billions of dollars.
Quantum computing is still in its early days. We haven’t yet reached the point where these machines are ready for everyday use. But the progress is real, and the excitement isn’t just hype. Step by step, researchers and engineers are figuring out how to build machines that don’t just compute faster—they compute differently. And that might be exactly what we need to solve the hardest problems of tomorrow.