Imagine mailing a letter. You, as the person sending the letter, know what the letter says. But the situation is different for the person you’re mailing the letter to. Until they read it, they generally won’t know what it says.
This is the way scientists think about information, at least in the classical sense.
A computer stores information, sends and receives information, and processes information. In a classical computer, the information travels in the form of a string of bits—a pattern of 1s and 0s. As each bit arrives, the recipient doesn’t know what value it will have; from their point of view, it is just as likely to be a 0 as it is to be a 1. To be sure, it will definitely be one or the other, but which it is will be revealed only once it arrives.
In this sense, upon its arrival each bit resolves a certain amount of uncertainty.
Now, it could be that knowing the start of a message gives clues about the rest of it. If a message starts, “O Romeo, Romeo,” it’s a good bet the message will conclude, “wherefore art thou Romeo?”
Still, knowing the first part of the message does not determine—perhaps more to the point, it does not affect—the next part of the message. It could be that the rest of the message is “could you get me a sandwich?”
All of this makes sense because classical information follows a set of rules.
Quantum information breaks those rules, making it at once a powerful basis for computing and an exquisitely fragile beast.
The quantum difference
The rules of classical information are so intuitive that they are easy to take for granted.
First, classical information is discrete: A bit is always either a 0 or a 1, and nothing in between. Second, bits are deterministic. That is, to the extent there is uncertainty in a bit, that uncertainty exists in the mind of someone who has not yet received a message (or in the possibility that an error might change the value of the bit). Finally, classical information is local—as in the Shakespeare example, a bit may suggest what’s coming, but observing that bit doesn’t actually affect any other bits.
Quantum information, on the other hand, is not discrete. A classical bit is definitely a 0 or a 1, but a quantum bit, called a qubit, can be a bit of both.
This feature allows a qubit to carry a different kind of information: continuous information about the relative balance of 0 and 1 within the qubit. Quantum algorithms can sometimes use this fact to run more efficiently than their classical counterparts.
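To see what that balance looks like, here is a minimal sketch in Python with NumPy (the language and library are choices made for illustration, not anything quantum mechanics requires). A qubit's state is a pair of complex amplitudes; their squared magnitudes, which must sum to 1, set the odds of reading out 0 or 1.

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# The squared magnitudes are the probabilities of measuring 0 or 1.
ket0 = np.array([1, 0], dtype=complex)  # definitely 0
ket1 = np.array([0, 1], dtype=complex)  # definitely 1

# An equal superposition: a continuous blend of 0 and 1, not a halfway value.
plus = (ket0 + ket1) / np.sqrt(2)

print(np.abs(plus) ** 2)  # [0.5 0.5]: equal chance of reading 0 or 1
```

Because the amplitudes vary continuously, there are infinitely many distinct qubit states between "definitely 0" and "definitely 1."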
Quantum information is also not deterministic. When someone takes a look at a classical bit, it simply is a 0 or a 1, as it was beforehand and as it will be afterward, apart from the possibility of error. Not so with qubits, which are affected by the measurement.
Although the qubit can be in any mix of 0 and 1, measuring it—as one would need to do to read the output of a calculation—forces it to be either 0 or 1. In general, there is some chance the answer will come out 0 and some complementary chance it will come out 1. This is not an error, and it is not the same as a message recipient simply not yet knowing the bit’s value—it is a fundamental feature of quantum physics.
Importantly, this feature also means that reading the output of a quantum computer—a kind of measurement—destroys most of the information it stores. Where once there was a superposition, measurement makes it so that all that’s left is a 0 or a 1.
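Continuing the toy NumPy representation above, here is a sketch of that measurement step: the probabilities come from the squared amplitudes (the Born rule), and after the measurement only a plain 0 or 1 survives.

```python
import numpy as np

rng = np.random.default_rng()

def measure(state):
    """Sample an outcome with probability |amplitude|^2, then collapse."""
    probs = np.abs(state) ** 2
    outcome = rng.choice(2, p=probs)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1  # the superposition is gone; only 0 or 1 remains
    return outcome, collapsed

# A state weighted 80/20 toward 0; measurement still yields a single bit.
state = np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex)
outcome, state = measure(state)
print(outcome)  # 0 on about 80% of runs, 1 on about 20%
```

Note that the continuous 80/20 balance never appears in any single readout; it can only be estimated by preparing and measuring the same state many times.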
Finally, quantum information is not local. While each classical bit is independent of every other bit, a qubit is typically not independent of other qubits.
For instance, engineers can prepare a pair of qubits in a state such that if one qubit is measured as a 0, the other must come out 1, and vice versa. In theory, engineers can build up systems with as many qubits as they want, where each qubit’s state depends on many other qubits’ states, and all are part of a complex entangled system.
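Here is a toy version of such a pair, again in the sketch's NumPy representation. The two qubits are described by a single four-amplitude vector, and every sampled outcome shows one qubit as 0 and the other as 1, even though neither qubit has a definite value on its own.

```python
import numpy as np

rng = np.random.default_rng()

# Two qubits are described jointly by four amplitudes, ordered 00, 01, 10, 11.
# This entangled state puts weight only on 01 and 10: the two qubits always
# disagree, yet neither one individually is determined in advance.
bell = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)

for _ in range(5):
    probs = np.abs(bell) ** 2
    index = rng.choice(4, p=probs)
    a, b = divmod(index, 2)  # decode the joint outcome into two bits
    print(a, b)              # always "0 1" or "1 0", never "0 0" or "1 1"
```

The key point is that the joint state cannot be split into one description per qubit; the information lives in the correlation itself.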
This observation has a curious consequence: Where classical bits store information locally and independently of each other, quantum information is typically stored in the relationships between individual qubits.
The upside of quantum information
Quantum superposition, measurement and entanglement introduce certain difficulties. For instance, there are more ways for errors to creep into the system. And quantum information has to be carefully protected from its environment, lest it become entangled with that environment and effectively lost. Quantum error correction is in turn more challenging, since a problem that affects one qubit can end up corrupting the entire system.
But quantum information brings with it some remarkable advantages as well, and these advantages are big enough to make it worth solving the challenges that arise.
One early argument for quantum computing goes something like this: Classical computers are deterministic things—that is, when they perform a calculation, they produce only one answer. Nature, on the other hand, is not perfectly predictable. Since some aspects of it are fundamentally quantum mechanical, nature can produce more than one answer. That means a classical computer is going to have a hard time simulating quantum behavior.
Imagine using a classical computer to simulate a single qubit. At a bare minimum, a classical computer would need many bits to describe what state the qubit was in, since the qubit could be in any combination of the 0 and 1 states. Add more qubits, and the classical computer would need still more bits to encode how they are entangled with one another, and even more to simulate what happens when someone performs a quantum algorithm and measures the output.
In other words, it takes far more than 10 classical bits to simulate 10 quantum bits: a general 10-qubit state is described by 2^10 = 1,024 continuously varying amplitudes. That suggests one might be able to do a lot more with 10 quantum bits than one could with 10 classical bits.
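A few lines of arithmetic make the scaling plain. (The 16-bytes-per-amplitude figure is just an illustrative assumption about how the numbers might be stored.)

```python
# A general n-qubit state requires 2**n complex amplitudes to write down.
for n in (1, 10, 50):
    amplitudes = 2 ** n
    # At 16 bytes per complex number, memory needs grow exponentially with n.
    print(f"{n} qubits -> {amplitudes} amplitudes (~{amplitudes * 16:.1e} bytes)")
```

At 50 qubits, the bookkeeping already runs to petabytes, which is why brute-force classical simulation of quantum systems hits a wall so quickly.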
But even that thought experiment doesn’t fully capture the distinction. There isn’t simply more information in a quantum bit—quantum superposition, measurement and entanglement also mean that the way we process and interact with quantum information is fundamentally different.
One consequence is that quantum computers could be better than classical computers even when it comes to solving some deterministic problems. A now-classic example is factoring, or finding the prime numbers that multiply together to make another number. While every number has exactly one prime factorization, finding it for large numbers is a very hard problem on classical computers. On a quantum computer running Shor’s algorithm, it’s comparatively easy.
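For a rough feel for the classical side of that gap, here is the most naive factoring method, trial division. It is only a sketch of why the problem gets hard as numbers grow (real classical algorithms are far cleverer but still scale super-polynomially), and it is not how Shor's algorithm works.

```python
def trial_division(n):
    """Factor n by testing every candidate divisor up to sqrt(n).

    For a b-bit number this takes on the order of 2**(b/2) steps; even the
    best known classical algorithms still scale super-polynomially in b,
    while Shor's algorithm needs only polynomially many quantum operations.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(21))    # [3, 7]
print(trial_division(8051))  # [83, 97]
```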
These distinctions don’t mean that quantum computers are better than classical computers at everything. The main point is that they are different and therefore suited to solving different kinds of problems, and indeed researchers are working hard to understand which computational problems quantum computers would be best suited to. What’s clear is that quantum information opens up new possibilities, and the future is still unwritten.