New Scientist

Technology

We might finally know how to use quantum computers to boost AI

Pushing against years of scepticism, an analysis suggests quantum computers may offer real advantages for running machine learning and similar algorithms in the near future

By Karmela Padavic-Callaghan

20 April 2026

Quantum computing and AI may one day work together

NESPIX/Shutterstock

Quantum computers might eventually be able to handle some AI applications that currently require huge amounts of conventional computing power. Such a development would be a major boost to machine learning and similar artificial intelligence algorithms.

Quantum computers hold the promise of eventually being able to complete certain calculations that are impossible for conventional computers. For years, researchers have been debating whether these advantages over conventional computers extend to tasks that involve lots of data, and the algorithms that learn from them; in other words, the machine learning that underlies many AI programs.

Now, a team at the quantum computing firm Oratomic argues that the answer ought to be "yes". Their mathematical work aims to lay the foundations for a future in which quantum computers offer a broad boost to AI.

"Machine learning is really utilised everywhere in science and technology and also everyday life. In a world where we can build this [quantum computing] architecture, I feel like it can be applied whenever there's massive datasets available," he says.

His team's work addresses the key question of how data collected in the non-quantum world, such as restaurant reviews or results from sequencing RNA, could be input into a quantum computer in such a way that the computer's quantumness can be leveraged to process the data, and learn from it, more efficiently.


This requires putting all of the data into a "superposition state", a mathematical combination that cannot be created in non-quantum machines. Until now, researchers thought that performing this task would be impractical, because they assumed that all of the data in that superposition state would have to be saved into dedicated memory devices before the quantum computer could process it, and those memory devices would have had to be impossibly large, says team member Huang at the California Institute of Technology.
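The memory appeal of a superposition state can be illustrated with simple arithmetic. Below is a minimal Python sketch assuming the common amplitude-encoding scheme, in which N classical values become the amplitudes of roughly log2(N) qubits; this encoding is an illustrative assumption, as the article does not specify which scheme the team uses.

```python
import math

# Toy illustration of amplitude encoding (an assumption; the team's
# actual encoding may differ): N classical values are normalised and
# treated as the amplitudes of a state on ceil(log2(N)) qubits.
data = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]

norm = math.sqrt(sum(x * x for x in data))
amplitudes = [x / norm for x in data]  # squared amplitudes sum to 1

# 8 values fit into just 3 qubits; a billion values would need ~30.
qubits_needed = math.ceil(math.log2(len(data)))
print(qubits_needed)  # 3
```

The exponential compression is exactly why the assumed dedicated memory devices would have had to be so large: writing all of a big dataset into such a state at once is the hard part.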

Huang and his colleagues took a different approach that does not require such memory devices. It involves feeding the data into the quantum computer in smaller batches, without saving it all before beginning to process it, similar to streaming a movie rather than downloading it in full prior to watching it.

They showed not only that this approach can work but that it would allow the quantum computer to process more data at a smaller memory cost than any conventional computer.

The memory advantage is so large, in fact, that a quantum computer made from about 300 error-corrected building blocks called logical qubits would outperform a classical computer built using every atom in the observable universe, says Zhao.
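The scale of that claim is easy to check with arithmetic: a 300-qubit register's state space has 2^300 amplitudes, which already exceeds the commonly cited order-of-magnitude estimate of about 10^80 atoms in the observable universe (the estimate is a standard figure, not taken from the paper).

```python
# 2**300 amplitudes in a 300-logical-qubit state space versus the
# rough standard estimate of ~10**80 atoms in the observable universe.
state_space = 2 ** 300
atoms_estimate = 10 ** 80  # order-of-magnitude estimate, not from the paper

print(state_space > atoms_estimate)  # True
print(len(str(state_space)))         # 91: 2**300 is a 91-digit number (~2e90)
```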

We are probably many years away from building quantum computers with 300 logical qubits, but Huang says that a 60-logical-qubit computer could plausibly be built by the end of the decade. The team's analysis suggests that, at this size, there would already be a notable quantum advantage over classical computers for some of the large-dataset processing tasks that AI is used for.

"The quantum machine is a very powerful device, but you do need to first feed it. This study talks about feeding and how it's enough to load [data] bit by bit, without overfeeding the beast," says Pérez-Salinas at ETH Zurich in Switzerland.

Nevertheless, he says that many questions about applying the new work to actual devices and real-world data still need to be addressed. Many past quantum machine-learning algorithms were eventually shown to be amenable to "dequantisation", a process in which the algorithms were adapted to no longer require any quantum hardware while retaining their excellent performance. It will be important to examine how crucial quantumness is to this new algorithm too, says Pérez-Salinas.

Dunjko, at Leiden University in the Netherlands, says that the new work could be a good match for large scientific experiments, such as those at the Large Hadron Collider, where millions of gigabytes of data are continuously created but most of it is discarded because of insufficient computer memory.

But it's likely that only some current AI applications and similar kinds of data processing will be amenable to being handled with a quantum computer rather than with a data centre full of conventional servers, he says. "This is not the majority of what GPUs are heating up the planet for, but may still be important," says Dunjko.

The researchers are now working both on expanding the kinds of algorithms their method could be useful for and on devising new ways to configure quantum computers so that they can handle data not just with very little memory but also in a practical amount of time.

Reference:

arXiv
