Chasing Qubits: A New Era of Computing

The Quantum Leap That Might Just Rewire Reality

🤏 A Little Context

The world has long demanded ever more computing power, from bits to bytes and beyond. For decades, Moore’s Law held: the number of transistors that can be fabricated on an integrated circuit doubles approximately every two years. But the curve has started to flatten. Transistors in modern integrated circuits are just tens of nanometres across, and at this scale the classical physics governing their behaviour starts to break down. Moore’s Law, which inspired unwavering confidence to pump billions into technologies requiring exponential growth in computing power, now faces a CREDIBILITY CRISIS.

We are approaching the physical limits of classical computing, where further miniaturisation brings unpredictable quantum effects and skyrocketing manufacturing complexity. The focus is SHIFTING: from what can be achieved under the laws of classical physics to what might be achievable under the laws of quantum physics. Can we build a quantum computer? How would it work? And fundamentally, what does ‘quantum’ even mean?

🔬 Quantum Physics

‘Quantum’ comes from Latin, meaning ‘how much’, and in physics it describes the idea that some properties come in discrete amounts, contrary to the classical picture of smooth variation: fundamental quantities such as light and energy arrive in tiny, indivisible packets.

In everyday life, we observe and treat light as a continuous, varying phenomenon, describable by classical physics. However, in 1900, Max Planck proposed in his work on blackbody radiation that energy is emitted and absorbed in discrete packets. In 1905, Albert Einstein extended the idea, arguing that light itself comes in discrete quanta (now known as photons) in order to explain the photoelectric effect. Slowly, a revolution was incited within the physics community, and quantum physics was born: the study of behaviour on such a small scale that classical laws no longer apply. Over time, ‘quantum’ became shorthand for strange, non-intuitive phenomena like superposition, entanglement, and quantisation. These concepts are perhaps most familiar as sci-fi buzzwords, despite being the very phenomena at the heart of quantum computing.
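To put a number on ‘indivisible amounts’: Planck’s relation says a single photon of frequency f carries energy E = hf. Here’s a rough back-of-the-envelope sketch in Python (the 500 nm wavelength is just an arbitrary choice of green, visible light):

```python
# Back-of-the-envelope: energy of a single photon via E = h * f.
PLANCK_H = 6.626e-34       # Planck's constant, in joule-seconds
SPEED_OF_LIGHT = 2.998e8   # metres per second

wavelength = 500e-9                      # 500 nm: green, visible light
frequency = SPEED_OF_LIGHT / wavelength  # roughly 6 x 10^14 Hz
energy = PLANCK_H * frequency            # roughly 4 x 10^-19 joules

print(f"One 500 nm photon carries ~{energy:.2e} J")
```

That is the smallest ‘slice’ of green light you can have; dimming a lamp doesn’t shrink the photons, it just sends fewer of them.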

🎻💻 Computing: Classical vs Quantum

Classical computers are networks of transistors: tiny switches flicking on or off, represented as ones and zeroes. This endless sea of bits, whilst a popular nod to some serious background processing, is really a code for the physical state of a complex electrical circuit. Each switch, though tinier than a dust particle, is gargantuan on the scale of fundamental particles, with millions of electrons surging through a lattice of semiconductors. Put simply, classical computing follows the rules of classical physics: a predictable realm where a bit can confidently exist as either a one or a zero.

However, the constant miniaturisation of transistors has reached the atomic scale, where classical computing systems become unreliable. Moreover, certain tasks, such as simulating quantum systems and cracking encryption, remain fundamentally difficult for classical computers, demanding huge numbers of bits and crippling amounts of processing time.
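One way to see why simulation in particular gets so painful: fully describing a quantum system of n two-level particles takes 2^n complex numbers. Here is a rough sketch of what that implies (assuming one 16-byte complex number per amplitude, which is just the standard double-precision format rather than a hard requirement):

```python
# Rough sketch: classical memory needed to hold the complete state of a
# quantum system of n two-level particles. The 2**n scaling is the point;
# the exact byte count per amplitude (complex128 = 16 bytes) is incidental.
BYTES_PER_AMPLITUDE = 16

for n in (30, 40, 50):
    gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} particles -> {gib:,.0f} GiB")
```

Fifty particles already demand around sixteen million gigabytes, which is why ‘just simulate it classically’ stops being an option rather quickly.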

Quantum computers don’t use bits. They don’t contain transistors. They don’t even follow the laws of classical physics. Rather, they’re an ever-evolving concept, with many technologies competing for dominance, each harnessing different physical properties. What they all share is that in place of bits they use quantum bits - affectionately shortened to qubits - which exist in what’s known as a superposition: a weighted combination of both one and zero until measured.
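To make ‘a weighted combination of both one and zero’ a little more concrete, here’s a minimal sketch using NumPy (no quantum hardware or vendor SDK involved): a qubit’s state is just two complex amplitudes, and squaring their magnitudes gives the odds of reading out a zero or a one.

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>.
# The state is a pair of amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
state = np.array([1, 1]) / np.sqrt(2)

# Born rule: the probability of each outcome is the squared magnitude
# of its amplitude.
probabilities = np.abs(state) ** 2
print(probabilities)  # [0.5 0.5]

# 'Measuring' forces a definite answer, sampled according to those probabilities.
rng = np.random.default_rng()
print(rng.choice([0, 1], size=10, p=probabilities))
```

Until that final measurement, the qubit genuinely isn’t committed to either answer - the amplitudes are the whole story.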

There are many candidate technologies for qubits, from electron spins to trapped ions to photons. But all aim to encode information in quantum systems, where the rules allow for vastly richer behaviour.

⚛️ Quantum Systems

This is where we come full circle: quantum physics is key to understanding what a qubit really is, and how we can picture it as a tangible concept. Light was found to come in indivisible packets of energy, marking its migration from classical to quantum physics and establishing the photon as something that can form part of a quantum system.

A quantum system is a physical system on such a small scale that it can exist in two or more distinct quantum states, which can reliably be manipulated and read out by a quantum computer. Whilst quantum systems exhibit discrete, well-defined states, corresponding to a one or a zero, they are engineered so that we don’t know which state they’re in, and that’s exactly the point.

If we knew what state the computer was in at all times, it wouldn’t be quantum; it would be deterministic. Instead, we treat the situation as probabilistic, describing the quantum system by its ‘quantum state’, which assigns a probability amplitude to each of the definite states; the squared magnitude of each amplitude gives the likelihood of finding the system in that state. These amplitudes are shaped by the conditions we apply to the system. However, the defining feature that makes quantum computing so powerful is that multiple qubits can be entangled, linking their quantum states and allowing the computer to explore many possibilities at once, in contrast to classical computing, which works through tasks sequentially.
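Here’s the earlier NumPy sketch extended to two entangled qubits (again, a toy illustration rather than a real quantum program). The four amplitudes sit entirely on the states 00 and 11, so the two qubits always come out matching, even though neither outcome is decided in advance:

```python
import numpy as np

# A maximally entangled two-qubit (Bell) state.
# Amplitudes are ordered as |00>, |01>, |10>, |11>.
bell_state = np.array([1, 0, 0, 1]) / np.sqrt(2)

probabilities = np.abs(bell_state) ** 2  # [0.5, 0, 0, 0.5]

# Sample some measurements: only '00' and '11' ever appear.
rng = np.random.default_rng()
print(rng.choice(["00", "01", "10", "11"], size=10, p=probabilities))
```

Describing n entangled qubits takes 2^n amplitudes - exactly the headroom a quantum computer gets to play with, and exactly what makes these machines so hard to imitate classically.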

🏭 Quantum Industry

But these quantum systems aren’t well-established, and they aren’t widely useful... yet. They are being pursued through radically different approaches, and many companies are quick to promise their qubit architecture as the most viable.

Big names like IBM, Google, and Intel are all investing heavily in superconducting qubits. New quantum-focused companies such as IonQ and Quantinuum are developing trapped-ion systems, whilst PsiQuantum are aiming to bring photonic qubits to the forefront. Each approach has its strengths, such as higher scalability or smoother integration with existing technologies.

We are living through the early days of what many call the ‘quantum race’, reminiscent of the space race of the mid-20th century. But instead of vying over whose flag gets to drift in near orbit, the quantum race dives into the jaws of capitalism, chasing leadership in a 1.42 billion USD industry (as of 2024), expected to grow to 4.24 billion USD by 2030.

Those figures may seem substantial, but next to the 681.05 billion USD leviathan that is the global semiconductor industry, quantum computing is still up-and-coming. Its potential, however, is massive, threatening to outperform classical computing on key tasks including optimisation, cryptography, and machine learning.

🎯 Final Take

Quantum computing is not science fiction. It’s the cutting edge of physics, the covert operation of tech giants, and the headache of their R&D teams. It is almost unbelievable to me that the computing industry as we know it could be overhauled, and it’s exciting to watch yet another application of physics migrate from the classical to the quantum.

Whether they redefine computing as we know it or settle into the role of specialised co-processors, quantum computers are in the spotlight. But will they be the answer to the credibility crisis facing Moore’s Law, and continue our exponential growth in computing power?

Thanks for reading,

Adam Garon