Quantum computers are coming to break our codes faster than anyone expected

Craig Costello, Queensland University of Technology

Online data is generally pretty secure. Assuming everyone is careful with passwords and other protections, you can think of it as being locked in a vault so strong that even all the world’s supercomputers, working together for 10,000 years, could not crack it.

But last month, Google and others released results suggesting a new kind of computer – a quantum computer – might be able to open the vault with significantly fewer resources than previously thought.

The changes are coming on two fronts. On one, tech giants such as IBM and Google are racing to build ever-larger quantum computers: IBM hopes to achieve a genuine advantage over classical computers in some special cases this year, and to deliver an even more powerful “fault-tolerant” system by 2029.

On the other front, theorists are refining quantum algorithms: recent work shows the resources needed to break today’s cryptography may be far lower than earlier estimates.

The net result? The day quantum computers can break widely used cryptography – portentously dubbed “Q Day” – may be approaching faster than expected.

The quantum hardware race

Quantum computers are built from quantum bits, or qubits, which use the counterintuitive properties of very tiny objects to carry out computations in ways that differ from traditional computers and can sometimes be far more efficient.

So far the technology is in its infancy, with the major goal being to increase the number of qubits that can be connected to work as a single computer. Bigger quantum computers should be much better at some things than their traditional counterparts – they will have a “quantum advantage”.

Late last year, IBM unveiled a 120-qubit chip, which it hopes will demonstrate a quantum advantage for some tasks.

Google also recently announced it planned to speed up its move to adopt encryption techniques that should be safe against quantum computers, known as post-quantum cryptography.

Alongside these tech giants, newer approaches are also flourishing. PsiQuantum is using light-based qubits and traditional chip-manufacturing technology. Experimental platforms such as neutral-atom systems have demonstrated control over thousands of qubits in laboratory settings.

In response, standards bodies and national agencies are setting increasingly concrete timelines for moving away from common encryption systems that are vulnerable to quantum attack.

In the United States, the National Institute of Standards and Technology (NIST) has proposed a transition away from quantum-vulnerable cryptography, with migration largely completed by 2035. In Australia, the Australian Signals Directorate has issued similar guidance, urging organisations to begin planning immediately and transition to post-quantum cryptography by 2030.

Algorithms make the lock-picking faster

Hardware is only half the story. Equally important are advances in quantum algorithms – ways to use quantum computers to attack encryption.

Much interest in quantum computer development was spurred by Peter Shor’s 1994 discovery of an algorithm that showed how quantum computers could efficiently find the prime factors of very large numbers. This mathematical trick is precisely what you need to break the common RSA encryption method.
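To see why period finding breaks RSA, here is a purely classical toy sketch in Python (using nothing beyond the standard library). The brute-force loop stands in for the one step a quantum computer would do efficiently: once the period of aˣ mod N is known, two greatest common divisors reveal the prime factors. The modulus 3233 = 53 × 61 is a made-up miniature example, not a real key.

```python
# A toy, purely classical sketch of the number theory behind Shor's algorithm.
# The quantum computer's only job is the middle step: finding the period r of
# a^x mod N efficiently. Here we find it by brute force, which is hopeless for
# real RSA-sized numbers but shows why the period reveals the prime factors.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). Brute force stands in for the quantum step."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_toy(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)    # lucky guess: a already shares a factor
    r = find_period(a, N)
    if r % 2 == 1:
        return None                         # odd period: retry with another a
    y = pow(a, r // 2, N)
    p = gcd(y - 1, N)
    if 1 < p < N:
        return p, N // p
    return None

print(shor_classical_toy(3233, 7))          # prints (53, 61): the prime factors
```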

For decades, it was believed a quantum computer would need millions of physical qubits to pose a threat to real-world encryption. This is far bigger than current systems, so the threat felt comfortably distant.

That picture is now changing.

In March 2026, Google’s Quantum AI team released a detailed study showing that far fewer resources may be needed to attack a different kind of encryption which uses mathematical objects called elliptic curves. This is what systems including Bitcoin and Ethereum use – and the study shows how a quantum computer with fewer than half a million physical qubits may be able to crack it in minutes.

That’s still a long way beyond current quantum computers, but around a tenth of earlier estimates.
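For a flavour of what elliptic-curve cryptography involves, here is a toy Python example on a deliberately tiny textbook curve (y² = x³ + 2x + 2 over the integers mod 17), not the real secp256k1 curve Bitcoin uses. Computing the public key from the private key is quick; recovering the private key from the public point is the “discrete logarithm” problem a sufficiently large quantum computer could solve efficiently.

```python
# A toy illustration of the elliptic-curve discrete-log problem that protects
# Bitcoin and Ethereum keys. The curve and base point below are tiny textbook
# parameters, not the real ones. The private key is simply the number of times
# the base point G is added to itself; recovering it from the public point is
# what a quantum computer running a Shor-style algorithm would attack.
p, a, b = 17, 2, 2                  # toy curve: y^2 = x^3 + 2x + 2 over GF(17)

def add(P, Q):
    """Add two points on the toy curve (None is the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Compute k*P by repeated doubling -- easy in the forward direction."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (5, 1)                          # a point on the curve: 1^2 = 125 + 10 + 2 (mod 17)
private_key = 9
public_key = scalar_mult(private_key, G)
print(public_key)                   # (7, 6): easy to compute, hard to invert classically
```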

At the same time, a March 2026 preprint from a Caltech–Berkeley–Oratomic collaboration explores what might be possible using neutral-atom quantum computers. The researchers estimate that Shor’s algorithm could be implemented with as few as 10,000–20,000 atomic qubits. In one design they propose, a system with around 26,000 qubits could crack Bitcoin’s encryption in a few days, while tougher problems like the RSA method with a 2048-bit key would need more time and resources.

In plain terms: the codebreakers are becoming more efficient. Advances in algorithms and design are steadily lowering the bar for quantum attacks, even before large-scale hardware exists.

What now?

So what does this mean in practice?

First, there is no immediate catastrophe – today’s cryptography won’t be broken overnight. But the direction of travel is clear. Each improvement in hardware or algorithms reduces the gap between current capabilities and useful quantum cracking machines.

Second, viable defences already exist. NIST has standardised several post-quantum cryptographic algorithms which are believed to be resistant to quantum attacks.

Technology companies have begun deploying these in hybrid modes: Google Chrome and Cloudflare, for example, already support post-quantum protections in some protocols and services.
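One way to picture these hybrid modes is as a session key derived from two independent secrets: one from a classical key exchange and one from a post-quantum scheme, so the connection stays safe as long as either part remains unbroken. The sketch below is illustrative only: it assumes the Python cryptography package for the classical X25519 half, and uses random bytes as a stand-in for the post-quantum (ML-KEM) shared secret rather than calling a real post-quantum library.

```python
# A minimal sketch of a hybrid key exchange, assuming the "cryptography" package.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
import os

# Classical half: an X25519 Diffie-Hellman exchange between client and server.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: in a real deployment this would come from an ML-KEM
# encapsulation; os.urandom here is only a stand-in for that shared secret.
pq_secret = os.urandom(32)

# Hybrid session key: derived from both halves, so it stays secure as long as
# either the classical or the post-quantum component survives.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-handshake-demo",
).derive(classical_secret + pq_secret)
print(session_key.hex())
```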

Systems that rely heavily on elliptic-curve cryptography – including cryptocurrencies and many secure communication protocols – will need particular attention. Google’s recent work explicitly highlights the need to migrate blockchain systems to post-quantum schemes.

Finally, this is a two-front race. It is not enough to track progress in quantum hardware alone. Advances in algorithms and error correction can be just as important, and recent results show these improvements can significantly reduce the estimated cost of attacks.

Every new headline about reduced qubit counts or faster quantum algorithms should be understood for what it is: another step toward a future where today’s cryptographic assumptions no longer hold.

The only reliable defence is to move – deliberately but decisively – toward quantum-safe cryptography.

Craig Costello, Professor, School of Computer Science, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Australia’s supercomputers are falling behind – and it’s hurting our ability to adapt to climate change

Christian Jakob, Monash University

As Earth continues to warm, Australia faces some important decisions.

For example, where should we place solar and wind energy infrastructure to reliably supply Australians with electricity? How can we secure our food production and freshwater supply? Should we invest in bigger dams to increase our resilience to drought, or better flood mitigation to manage more intense rainfall?

Deciding on the best path forward depends on having reliable and detailed information about how wind, water and sunlight will behave in our future. This information is provided by climate models: large computer simulations of Earth, based on the fundamental laws of physics, that represent everything from the Sun’s radiation, the carbon cycle and clouds to the ocean circulation as mathematical equations.

Running these models requires the most powerful computers available – also known as supercomputers – as well as large amounts of space to store the model results for use by governments, businesses and scientists alike.

But right now, Australia’s supercomputers are falling behind the rest of the world – and this constitutes a serious risk to our ability to mitigate and adapt to climate change.

What is a supercomputer?

What makes a computer a supercomputer is its sheer computing scale and, as a result, its ability to perform a huge number of calculations in a very short time.

Australia has two main national supercomputers for research: Gadi and Setonix.

Gadi, located at the National Computational Infrastructure at the Australian National University in Canberra, is the main machine used in climate computing in Australia. It contains a vast number of computer chips: more than 250,000 central processing units (CPUs) and 640 graphics processing units (GPUs). It is the CPUs that have made Gadi the Australian climate computer of choice.

Compare this with my humble MacBook Pro M3, which effectively sports 11 CPUs and 12 GPUs, and you understand why Gadi is called a supercomputer.

There has always been a strong connection between supercomputing and climate modelling, with climate models steadily improving as scientists access bigger and better supercomputers.

The secret lies in dividing Earth into finer and finer pieces and adding more of the important processes that affect our weather and climate. Both enhance the reliability of the model results.

While most climate models divide Earth into a grid of squares roughly 100km in size, the most advanced global climate models today simulate the behaviour of Earth’s atmosphere, ocean, land and ice using a grid of only a few kilometres. It’s like going from a grainy black and white television to an ultra high-definition one.
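Some rough arithmetic shows why that jump in sharpness demands so much more computing power. The Python sketch below is a back-of-the-envelope estimate only: it counts horizontal surface cells from Earth’s approximate surface area and uses a crude cubic cost scaling (more cells plus a shorter time step), ignoring vertical levels and the extra processes real models add.

```python
# A back-of-the-envelope sketch of why finer climate-model grids need
# supercomputers. Halving the grid spacing quadruples the number of surface
# cells, and the model's time step usually has to shrink as well, so the
# total cost grows roughly with the cube of the resolution improvement.
EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth in square km

def surface_cells(grid_km):
    """Approximate number of horizontal grid cells covering Earth's surface."""
    return EARTH_SURFACE_KM2 / grid_km**2

for grid_km in (100, 25, 5, 2.5):
    cells = surface_cells(grid_km)
    relative_cost = (100 / grid_km) ** 3   # crude scaling: more cells x smaller time step
    print(f"{grid_km:>6} km grid: ~{cells:,.0f} surface cells, "
          f"~{relative_cost:,.0f}x the cost of a 100 km model")
```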

Doing so requires the most advanced supercomputers. These include LUMI in Europe and the Frontier machine in the United States.

These big machines aren’t just tools for climate scientists. They also underpin the operational delivery of climate information to all sectors of society, safeguarding property and lives in the process.

A kilometre-scale climate modelling system for societal applications has just been developed in the European Union. Known as the “Climate Change Adaptation Digital Twin”, it represents a major leap forward in our understanding of how climate change will impact Earth – and our ability to respond to it.

How does Australia stack up globally?

So how does Australia stack up in the quest to have a supercomputer that can produce the best climate information possible to future-proof our nation?

The Gadi supercomputer is currently ranked 179th in the world. It was in 24th position in 2020, when it was introduced.

For comparison, the Frontier supercomputer is ranked 2nd. The LUMI supercomputer is ranked 9th. Topping the list is the El Capitan supercomputer in the US.

In May 2025 the federal government announced A$55 million to renew Gadi.

This is roughly two-thirds of the funding it received for its previous upgrade in 2019, and will only lead to a moderate increase in our climate computing abilities – well behind the rest of the world.

A major disadvantage

This puts Australia at a major disadvantage when it comes to planning for the future.

But why can’t we just use the more advanced models and supercomputers developed elsewhere?

First, apart from our own ACCESS global model, all climate models are built in the Northern Hemisphere. This means they are calibrated to do well there, with limited attention paid to our region.

Second, making good decisions about Australia’s future requires us to be self-sufficient when it comes to simulating the climate system using scenarios defined by us and relevant to our region.

This has been brought into sharp focus by recent cuts to climate science in the US.

In short, good decisions on our future require self-sufficiency in climate modelling. We actually have the software (the ACCESS model itself) to do this, but the current and planned supercomputing and data infrastructure to run it on is simply outdated.

An ambitious solution

Learning lessons from the international community, it is time to think big and integrate the power of existing climate modelling with the emerging abilities of artificial intelligence (AI) and machine learning to build a “digital twin” of Australia.

With weather and climate at its heart, the digital twin could directly integrate other major features of Australia, such as its ecosystems, cities, and energy and transport systems.

The cost of such a facility, and of the research and operational capability needed to enable it, is large. But the cost of poor decisions based on outdated information could be even higher.

Christian Jakob, Director, ARC Centre of Excellence for the Weather of the 21st Century, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
