17 Comments
Ryan R. Campbell:

Great explainer! Thanks for taking the time to break it all down; so many of the other articles I've encountered about this topic seem to be just relaying what they've read elsewhere with little care given to actually explaining the details and what the consequences for these advances might be.

Meg McNulty:

Appreciate that! Quantum computing is moving fast, but the conversation around it is too often either too high-level or overly technical. My goal is to bridge that gap—explaining not just what’s happening, but why it matters. Glad you found it useful! What aspects of quantum’s future are you most interested in?

Aravind Narayan:

Putting the cart before the horse.

The mathematics of quantum error correction to minimize decoherence is a long way from enabling comprehensive applications.

Meg McNulty:

Hey Aravind! Super appreciate the response. I don't completely agree - imo, real-world use cases don't only depend on fault-tolerant quantum computing. Even noisy quantum devices have value today, particularly for optimization, chemistry simulations, and near-term hybrid approaches that combine classical and quantum computing.

Aravind Narayan:

Nah… these noisy quantum devices you're talking about are mostly worthless when it comes to continuous quantum error correction to mitigate decoherence.

Works at a very rudimentary level, but that's about it.

Yeah, the hybrid stuff like D-Wave can be used for limited purposes in the domain of chemistry, and for some nonlinear optimization applications that are meaningful… but you're overstating it, at least for now. I'm sure we'll make headway, but it may not be how you envision it.

Ryan R. Campbell:

At this stage, I'm largely just hoping to stay in the loop. Like you point out, the rapid movement in research, and the resulting leaps forward, will surely continue to come at breakneck speed in the coming months and years.

Ultimately, though, I'm interested in what this will look like for everyday consumers. Like you mention, the applications in healthcare and logistics stand out as opportunities for improvements in the lives of everyday people, but I wonder how these advances might eventually be taken advantage of directly in consumer products.

Maybe this type of computing ends up having utility in the on-the-horizon market for in-home robotics? Perhaps we're a long way off from that, but it is something I'll be keeping an eye on.

Meg McNulty:

A great call-out! Quantum’s impact on consumers is still a big unknown, but the pieces are starting to come together. SK Telecom’s partnership with IonQ (announced Feb 27) is a good example—it's integrating quantum into networking and security, including quantum-safe encryption and optimizing AI-driven services. That’s the kind of groundwork that could eventually shape consumer tech.

AI and quantum are already overlapping in areas like logistics and drug discovery, but longer-term, we could see them power faster networks, ultra-secure communications, and even smarter automation in consumer devices. It’s still early, but moves like this partnership show that quantum isn’t staying locked in research labs. The big question is when it’ll hit that tipping point and start influencing the tech we interact with daily. Definitely worth watching.

No Soy Seldon:

Thanks, great read. But I still don't get what a quantum computer does, or tries to do, and how. Any recommendations for further reading about it?

Meg McNulty:

Hey, so glad to hear you enjoyed it!

For further reading (rather than just news bits), I like to recommend:

- “Quantum Computing for Everyone” by Chris Bernhardt – a solid, accessible intro

- IBM’s Quantum Computing Blog – great for practical insights

- MIT’s Quantum Computing Course on OpenCourseWare – if you want more depth

And my short tl;dr: a quantum computer processes information in fundamentally different ways than a classical computer, leveraging superposition (where qubits exist in multiple states at once) and entanglement (where qubits' outcomes are correlated no matter the distance between them). This allows quantum computers to solve certain complex problems (like factoring large numbers) exponentially faster than classical machines.
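If it helps to see the math concretely, here's a tiny Python toy of my own (not from any real quantum SDK) showing how amplitudes give you superposition and entanglement:

```python
import math

# A quantum state is a list of amplitudes; the probability of measuring
# each basis outcome is the squared magnitude of its amplitude (Born rule).
def probabilities(state):
    return [abs(a) ** 2 for a in state]

h = 1 / math.sqrt(2)

# Superposition: a Hadamard gate takes |0> to (|0> + |1>)/sqrt(2),
# so a measurement returns 0 or 1 with equal probability.
plus = [h, h]
print(probabilities(plus))  # ~[0.5, 0.5]

# Entanglement: the two-qubit Bell state (|00> + |11>)/sqrt(2).
# Amplitudes are ordered [|00>, |01>, |10>, |11>].
bell = [h, 0.0, 0.0, h]
print(probabilities(bell))  # ~[0.5, 0.0, 0.0, 0.5]
# Only 00 or 11 is ever observed: the two qubits are perfectly correlated.
```

The potential speedup comes from manipulating all of those amplitudes at once via interference, rather than checking each outcome one at a time.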

Let me know if you’re looking for something more specific!

JibbatheHut:

Do we know why Majorana particles tend to be interference-free?

The Majorana chip fits in the hand; do all the other kinds of chips also fit in the hand? Or does this chip still belong to a massive chandelier-looking computer?

Meg McNulty:

Good questions!

1/ Majorana particles' quantum state naturally splits between two locations in a system. This happens in certain superconducting environments, where instead of existing as a single, well-defined particle, they appear as two halves that are physically separated. Since their full state only exists when considering both halves together, local disturbances don’t easily affect them, making them more stable against interference.

2/ Most quantum computers today—like those from IBM, Google, and Rigetti—use 'superconducting qubits,' which require the massive chandelier-like dilution refrigerators to keep them near absolute zero. These systems don’t use “chips” in the classical sense; instead, they rely on fabricated qubit circuits on specialized substrates housed within those cooling systems. While MSFT's Majorana chip itself may be small and fit in the hand, it still depends on external hardware for cooling and control. It’s unclear if Microsoft has fully moved beyond the chandelier setup, but any practical quantum system still needs significant infrastructure to operate.

Some alternative approaches, like photonic quantum computing (PsiQuantum, Xanadu) or trapped-ion systems (IonQ), aim for more scalable architectures that could avoid the chandelier entirely. But as of now, quantum computing is still far from a single-chip, room-temperature device.

Feisal Nanji:

Good, crisp explanation of the various approaches to QC. The question is how long before MSFT can sell this. It looks at least several years away… your take?

Meg McNulty:

Definitely several years minimum... I will say they expedited the timeline. Many of us were thinking ~10-20 years; this has cut that to... 6-15? Selling it at scale, of course, becomes its own can of worms beyond fault-tolerance, in terms of commercial applications. What we need now is builders creating hybrid (classical + quantum) approaches to help enterprises and governments get excited about the use cases. Then, mass production at a commercially viable cost is a whole other challenge (incl. securing rare materials, and developing a supply chain that can reliably produce and maintain quantum hardware).

Pablo:

Too much hype.

You have to start by explaining what algorithms outside of Shor’s, and maybe Grover’s, benefit from a quantum computing approach. I’ve seen dozens and dozens of articles purporting huge future benefits without mentioning which algorithm will be used. You don’t need QC expertise to identify the flaw.

Second, if Microsoft’s material is so good as to provide stable qubits, it will have a much harder time establishing the entangled states without which known algorithms do not work. So we go back to square one.

Meg McNulty:

The skepticism is fair—there’s been a lot of quantum hype with little focus on actual algorithmic advantage beyond Shor and Grover. But algorithms like QAOA and VQE are already showing promise in optimization and chemistry, even on noisy devices. And especially when combined with/supported by classical compute. As for MSFT's approach - it’s a fair critique, but it assumes Microsoft hasn’t already considered this tradeoff. Their bet is that topological qubits will enable better entanglement at scale once the system is fully realized. Whether they’re right or not is still an open question, but saying we’re “back to square one” ignores the fact that real-world progress isn’t binary. Either way, worth watching!

Pablo:

The back-to-square-one comment was about the challenges of entangling a large number of qubits, which makes system-level compute infeasible. I haven’t seen a precise explanation of how they plan to do this with “topological qubits” - a catchy name that smells of a corporate need to be different. How do they control state superposition? What’s the fundamental mechanism for entanglement?

Ken Arakelian:

What useful work can a quantum computer do today faster than a conventional computer?
