If you’ve been keeping up with tech news, you’ve likely seen the buzz around quantum computing. Microsoft’s latest breakthrough in this space has made headlines recently, particularly for discovering a new phase of matter (more on what this means below). But should we care, or is this just science fiction?
One reason I’m taking this more seriously is self-driving cars. For many years, I thought of these as perpetually “five years away.” Then, suddenly, they were here. Now Waymo does 150K autonomous rides per week and is expanding rapidly. Even though it’s a completely different technology, I wonder if quantum computing will be similar: it seems like a wild experiment right up until it starts having an impact.
For AI investors, the quantum computing question is far from academic even if it is probably a decade away. If it works, it could enable a dramatic increase in model performance and change the way we think about AI infrastructure. Massive GPU data centers might no longer be needed.
But first, let me attempt to explain what this is all about.
Solid, Liquid, Gas, and Topological Superconductor(?)
The biggest headline for Microsoft’s new discovery was that they created a new phase of matter (you probably know solid, liquid, and gas) called a topological superconductor. It’s worth mentioning that before this announcement, labs had already produced lots of weird phases of matter, such as Bose-Einstein condensates, superfluids, and (I’m not making this up) Time Crystals.1
The basic idea of quantum computing is that instead of using bits that are strictly 0s and 1s like we do today, we use something called a qubit, which exploits quantum mechanics to exist in a superposition: effectively a blend of 0 and 1 at the same time. This enables massively parallel processing in a way that’s not possible with today’s computers, and problems that are very hard for today’s computers, like factoring large numbers, suddenly become easy. The technical details aren’t important – just remember that quantum computers can do a large number of calculations at once.
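If you’d like to see the idea in math rather than metaphor, here’s a minimal numpy sketch of mine (simulating the algebra, not real hardware): a qubit is just a pair of complex amplitudes, a gate is a matrix, and n qubits are described by 2^n amplitudes at once.

```python
import numpy as np

# A qubit's state is a vector of two complex "amplitudes", one for 0 and one for 1.
zero = np.array([1, 0], dtype=complex)   # behaves like a classical 0

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero                    # amplitudes: [0.707..., 0.707...]

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(superposed) ** 2)           # -> [0.5 0.5]

# The source of the power: n qubits are described by 2^n amplitudes at once.
n = 30
print(f"{n} qubits -> {2**n:,} amplitudes evolving together")
```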
One of the challenges of quantum computing is the issue of decoherence. If you know one thing about quantum mechanics, it’s probably that observing a quantum object changes it (the observer effect, a close cousin of the famous Uncertainty Principle). This is a problem for quantum computers because it causes errors to creep in. Existing quantum computers rely on complex error-correction schemes to fix things. The big breakthrough for the Microsoft chip is that it is much more robust against errors.2 That means it will be significantly easier to use as a building block for large quantum systems.
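To get a feel for what error correction costs, here’s the classical analogue (my own toy sketch; real quantum codes are far more elaborate): a repetition code that spends three physical bits per logical bit so a majority vote can undo one flip. Microsoft’s pitch is that topological qubits resist errors at the hardware level, so far less of this overhead should be needed.

```python
from collections import Counter

def encode(bit: int) -> list[int]:
    # Store one logical bit as three physical copies.
    return [bit, bit, bit]

def noisy_channel(bits: list[int], flip_index: int) -> list[int]:
    # Simulate noise flipping one of the physical bits.
    corrupted = bits.copy()
    corrupted[flip_index] ^= 1
    return corrupted

def decode(bits: list[int]) -> int:
    # A majority vote recovers the logical bit despite one error.
    return Counter(bits).most_common(1)[0][0]

received = noisy_channel(encode(1), flip_index=0)  # [0, 1, 1]
print(decode(received))  # -> 1, the flipped bit is corrected
```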
QuantumAI
If I haven’t lost you yet, I will now explain the implications of this for GenAI. You are probably aware that GPUs are the preferred chips for GenAI training and inference. The reason is that GPUs are very good at parallel processing: they can handle thousands of calculations at once. At the core of LLMs, there are a bunch of complex matrix multiplications and other calculations where that parallel processing speeds things up a lot.
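Here’s a toy numpy illustration of the kind of math involved (the sizes are made up, but the operation is the one at the heart of transformer layers): a single matrix multiplication hides hundreds of millions of independent multiply-adds, which is exactly what GPUs parallelize.

```python
import numpy as np

# Toy transformer-style layer: every token's vector gets multiplied by a
# weight matrix. Real models do this billions of times per response.
batch, seq_len, d_model = 4, 256, 512          # illustrative sizes, not real ones
x = np.random.randn(batch, seq_len, d_model)   # token representations
W = np.random.randn(d_model, d_model)          # layer weights

y = x @ W   # one matmul

ops = batch * seq_len * d_model * d_model
print(f"~{ops:,} multiply-adds in this single layer")
# GPUs win because thousands of cores chew through these products in parallel.
```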
If you think back a few paragraphs, you might remember that quantum computers are also really good at parallel calculations, potentially orders of magnitude more efficient than GPUs.3
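To make that concrete (footnote 3 has the details), here’s the gap between cubic and polylog scaling. I’m assuming base-2 logarithms here; the exact exponents and constants vary by algorithm and come with real caveats.

```python
import math

# Compare O(n^3) classical scaling with the claimed O((log n)^2) quantum scaling.
for n in [100, 1_000, 1_000_000]:
    classical = n ** 3
    quantum = math.log2(n) ** 2
    print(f"n = {n:>9,}: classical ~{classical:.0e} steps, quantum ~{quantum:.0f}")
```

At n of a million, that’s a quintillion steps versus a few hundred, which is why people get excited.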
If we can successfully build scalable, error-resistant quantum computers—like those that might be enabled by Microsoft’s new approach—the potential impact on GenAI could be enormous. Here’s how:
Drastically Faster Computation:
With quantum computers, tasks that currently require extensive GPU-based computation could be performed much faster. This could mean instant response times for even very complex queries. AI “thinking” might become incredibly fast, which would enable the bots to explore vastly more options before answering.
Reduced Reliance on Massive GPU Farms:
Today’s GenAI models often need large data centers filled with GPUs, consuming vast amounts of power and generating considerable heat. If quantum computers can handle certain types of computation more efficiently, we might see a shift away from these enormous GPU farms. Instead, quantum processors, if scaled up and made stable, could take over some of the heavy lifting in AI tasks. Of course, this creates a different set of infrastructure challenges, since quantum computers must be kept at temperatures very close to absolute zero at all times. GPUs wouldn’t disappear overnight, but over time, the computational load could be shared between classical and quantum systems, significantly reducing energy needs. Again, this shift is probably 10+ years away.
Enhanced Problem-Solving Abilities:
Quantum computing could open up new avenues in how we design and train AI models. There are a bunch of math tasks that happen in these models4, and if we speed those up dramatically, it could lead to a new generation of more advanced generative models with capabilities way beyond today’s bots. (The sketch after this list shows one example of such a task.)
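As one concrete example of those math tasks (see footnote 4): every token an LLM emits comes from sampling a probability distribution over its entire vocabulary. Here’s the classical version as a numpy sketch of mine, with toy numbers; the speculative quantum win would be handling this kind of high-dimensional sampling natively.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size = 50_000                     # roughly GPT-scale vocabulary

# An LLM's final layer produces one raw score ("logit") per vocabulary token.
logits = rng.standard_normal(vocab_size)

# Softmax turns the scores into a probability distribution over all tokens...
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# ...and generation samples the next token from it, once per token emitted.
next_token = rng.choice(vocab_size, p=probs)
print(next_token)
```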
How Close Are We? Signs to Watch For
While the promise of quantum computing in AI is very exciting, let’s be clear: we’re probably a decade away from seeing this technology even augment the current GPU-based systems. Here are two things to watch for in headlines to see if we’re getting close:
Scaling Up Qubit Count:
Right now, most quantum devices have tens or maybe a few hundred qubits. (The Microsoft one has 8.) For a quantum computer to tackle GenAI problems, we will need hundreds of thousands to millions of qubits. Most announcements in the space include the qubit number, so take a look at that to see how close we are. Until we get to 100K, we still have a ways to go. (A back-of-the-envelope sketch after this list shows why big qubit counts matter so much.)
Real-World Quantum Advantage:
Google demonstrated “quantum supremacy” in 2019 when its quantum computer solved, in 200 seconds, a problem that would’ve taken a traditional computer 10,000 years. Unfortunately, this was a made-up problem with no practical application. Once someone does that with a real-world problem, such as optimizing logistics, running complex simulations, or handling specific tasks in AI training, then we’ll know that quantum computing is ready to use.
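Why do qubit counts matter so much? An n-qubit machine is described by 2^n amplitudes, which is also why you can’t just simulate a big quantum computer on classical hardware. Here’s a quick back-of-the-envelope calculation (assuming 16 bytes per complex amplitude):

```python
# Memory needed to store the full state of n qubits on a classical machine:
# 2^n complex amplitudes at 16 bytes each (two 64-bit floats).
for n in [8, 30, 50, 100]:
    bytes_needed = (2 ** n) * 16
    print(f"{n:>3} qubits -> {bytes_needed:.3e} bytes")

# 8 qubits: ~4 KB (trivial). 30 qubits: ~17 GB (a beefy laptop).
# 50 qubits: ~18 petabytes. 100 qubits: ~2e31 bytes, more than any data
# center on Earth could hold. Past ~50 qubits, only real hardware will do.
```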
Conclusion
In short, Microsoft’s breakthrough is an important step forward in Quantum Computing. Qubits built on their new phase of matter are less prone to errors, which means they could be the building blocks for much larger and more complex quantum systems. That said, it will take a while to get from 8 qubits to 1 million qubits. But with this breakthrough, it becomes more likely that it will happen eventually.
Whenever Quantum Computing happens, it will be a big deal for GenAI, potentially enabling dramatically more powerful models. Could Quantum Computing be the missing piece for AGI? Maybe. It will also be a big deal for AI data centers as they will need to rip out many, many racks of GPUs and replace them with cryostats to keep their quantum chips at near absolute zero.
For investors and AI enthusiasts, pay attention to this space, especially if you are contemplating long-term infrastructure investments that could be impacted by a shift toward quantum computing. Although if quantum computing enables super-powerful AI, that may be the least of our worries!
These have only been created in labs, aren’t stable yet, and have nothing to do with GenAI. If you are curious, I asked ChatGPT to explain it to me like a fifth grader: “Imagine you have a toy that moves in a special, repeating dance all by itself—even when you aren’t touching it. Normally, if nothing is pushing something, it just sits still. But a time crystal is like a magic clock that never stops ‘ticking’ or dancing, even when it's not being powered by anything extra.” Um, okay.
As ChatGPT put it, “In simpler terms, the qubit’s data is encoded in a subtle, non-local property of the material, which external noise finds much harder to disturb. This gives the qubit a built-in form of error protection at the hardware level.”
Specifically, classical algorithms for these linear-algebra tasks take on the order of n^3 time, while the quantum versions are something like (log n)^2. For n = 1,000, that’s the difference between a billion calculations and roughly a hundred!
As an example, models often sample from complex, high-dimensional probability distributions, which takes a lot of computation. Quantum computers could, in principle, do that really fast.