Changing the core of computing
It’s been decades since Richard Feynman first proposed the possibility of using quantum mechanics for computing and urged the world to move forward in that direction. Yet, for all the time the concept of quantum computing has been with us, the general public is still in the dark about what it implies and what it could mean for our future.
That lack of knowledge is probably tied to the fact that quantum computing has, for the most part, been worked on at a purely theoretical level. It didn’t help, either, that quantum computers were being developed with military goals in mind, which kept most of their advantages from gaining wider notoriety.
Be that as it may, recent years have begun putting quantum computing at the center of the scene. That’s because we’re reaching certain physical and technological limitations in our traditional transistor-based computing systems. So, in order to keep enhancing our computers, we need to change the very core of computing. That’s when quantum mechanics makes its appearance.
A tipping point in computing history
To truly understand what quantum computing might bring, it’s important to understand our current computing system and some of its basic concepts. For instance, you’ve probably heard about the binary system used by computers to carry out their normal tasks.
In case you don’t know what that is, it’s enough to know that computers use transistors to store and process data. These transistors basically work as on-and-off switches, combined with one another to carry out the predefined instructions that make up a particular program.

Transistors can only “exist” in one of two positions (on or off, 0 or 1). Each of these positions encodes a bit, and combinations of bits make up all the files and programs you use on a computer. Transistors control the passage of electrical signals: if a signal passes through the transistor’s logic gate, it registers as a 1; if it doesn’t, it’s a 0.
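The on/off switch model above can be sketched in a few lines of Python. This is only an illustration of the idea, not how hardware actually stores data: every character on your computer is ultimately a pattern of bits.

```python
# Toy illustration: every character is stored as a pattern of bits.
# Here we turn the letter 'Q' into its 8-bit pattern and back again.

def to_bits(ch: str) -> str:
    """Return the 8-bit binary pattern for a single character."""
    return format(ord(ch), "08b")

def from_bits(bits: str) -> str:
    """Reassemble a character from its 8-bit pattern."""
    return chr(int(bits, 2))

pattern = to_bits("Q")
print(pattern)             # 01010001
print(from_bits(pattern))  # Q
```

Each `0` or `1` in that pattern corresponds to one transistor blocking or passing a signal.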
This basic model has served us amazingly well for the last few decades. Thanks to the miniaturization of electronic devices and their components, computers grew more powerful, packing in millions of transistors that can be as small as 7 nm (roughly 1,000 times smaller than a red blood cell!). That worked well for a long time – but we’re reaching the point where we won’t be able to build smaller transistors.
The reason is a physical limitation that renders smaller transistors useless: below that minimum size threshold, the electric current would simply tunnel through the transistor as if it weren’t there – itself a quantum effect. In other words, transistors would be so small that they could no longer act as logic gates. Therefore, there would be no 0s and 1s, and no binary system.
This poses a challenge. We could settle for the devices we have right now, accepting the current balance between small size and processing power. We could keep adding transistors at the smallest feasible size – but then computers more powerful than today’s would have to grow physically bigger to accommodate them. Or we could replace the binary system altogether and try a new paradigm, which is precisely what quantum computing offers.
Thus, we’ve come to a tipping point in computing history that needs a decision. And it seems that we’re taking the leap into quantum computing. But how does it work?
Instead of using bits to represent data, quantum computing uses qubits. These are realized through quantum particles such as electrons and photons. Such particles have specific, measurable properties that can be used to represent data (either a 0 or a 1). For instance, an electron’s spin can point in a specific direction: a qubit spinning upward can be read as a 1, while one spinning downward can be read as a 0.
Now, this all sounds much like the transistor-based system. However, qubits have a unique property that makes quantum computing more powerful – superposition, which allows a quantum particle to exist in multiple states at the same time. Thanks to it, a qubit can be a 1 and a 0 simultaneously, and it can remain that way during a computation, only becoming one or the other at the end, when it is measured.
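Superposition can be caricatured in plain Python – this is a toy simulation, not a real quantum library. A qubit’s state is a pair of amplitudes; measuring it collapses the superposition to a definite 0 or 1, with probabilities given by the squared amplitudes.

```python
import random

# Toy sketch: a qubit is a pair of amplitudes (a, b) with
# |a|^2 + |b|^2 = 1. Measurement yields 0 with probability |a|^2
# and 1 with probability |b|^2 — until then it is "both at once".

def measure(amplitudes, rng=random.random):
    """Collapse the superposition to a definite 0 or 1."""
    a, b = amplitudes
    p0 = abs(a) ** 2
    return 0 if rng() < p0 else 1

# Equal superposition: the state (1/sqrt(2), 1/sqrt(2))
plus = (2 ** -0.5, 2 ** -0.5)
samples = [measure(plus) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ≈ 0.5: half the time 0, half 1
```

A definite state like `(1.0, 0.0)` always measures 0 – that corresponds to an ordinary classical bit.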
Superposition combines with a second property to explain quantum computing’s superior performance: quantum entanglement. Entangled quantum particles become dependent on one another, so that their properties behave as a single system – measuring one qubit immediately determines the state of its entangled partner.
Why is this important? Because it allows for genuine parallelism: many calculations can be explored at the same time, saving a lot of time, offering superior performance, and making more complex problems tractable.
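The correlation described above can also be sketched as a toy simulation (again plain Python, not real hardware). A maximally entangled pair only ever yields matching results: sampling it gives “both 0” half the time and “both 1” the other half, never a mix.

```python
import random

# Toy sketch of an entangled ("Bell") pair: the two-qubit state is an
# equal superposition of 00 and 11. Sampling a measurement therefore
# returns (0, 0) or (1, 1) with equal probability — the two qubits'
# outcomes always agree.

def measure_bell_pair():
    """Sample one joint measurement of the entangled pair."""
    return (0, 0) if random.random() < 0.5 else (1, 1)

outcomes = [measure_bell_pair() for _ in range(1000)]
print(all(a == b for a, b in outcomes))  # True: perfectly correlated
```

No matter how many times you sample, the two results never disagree – that lock-step behavior is what the article means by entangled qubits acting as a single system.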
The importance of quantum computing
All of the above should be enough to understand why quantum computers are needed for the future: technical limitations are preventing us from gaining more computational power. But why would we need more computational power than we already have, you ask?
Well, there are a number of benefits that quantum computing can bring to the table. For instance, given that we’re living in the big data era, analyzing huge datasets and recognizing patterns would be a lot faster with quantum devices. The technology could also be used to train artificial intelligence systems, develop ultra-precise sensors for medical purposes, and boost telecommunications.
The belief that quantum computing will be a game-changer for the entire world led the European Union to launch the Quantum Technologies Flagship, a €1 billion initiative that will fund thousands of quantum research projects over the next 10 years to drive the “second quantum revolution” and the major technological advances it promises. Naturally, no one says that getting there will be easy.
Scientists are still struggling to create a controlled environment for qubits and are looking into ways to manipulate quantum particles’ properties to get the desired results. But the work is being done, and development is moving forward. In fact, some people believe that quantum computers will be commercially viable by 2040. That might seem like a far-off future, but since quantum computers could radically change the way we use computers, preparing for that scenario is a must.
As a matter of fact, the United States’ National Institute of Standards and Technology (NIST) is already working on one of the most troublesome aspects of quantum computing – security. Since the computational power these machines would bring is immense, our current encryption technologies could be rendered obsolete in short order. That’s why NIST is already working on post-quantum cryptography, to ensure the transition doesn’t produce unexpected results in sensitive areas like data management and storage.
Quantum computing represents a huge leap beyond our current computing possibilities. With quantum devices, we’ll be able to calculate, simulate, and analyze in minutes datasets and scenarios that would take our traditional computers centuries to work through. Such power, however, comes with challenges of its own, which are still being researched before we can fully grasp their effects.
We are still some years away from enjoying these kinds of devices, and it’s hard to predict the impact they might have on our daily lives. It’s worth taking a look at quantum computing nonetheless, since the future seems tied to its development – and the changes it may bring promise to be staggering.