
Researchers race for quantum AI as quantum computing advances

Machine learning is likely to be an early application of quantum computers, as researchers and developers look for the key to a more human-like artificial intelligence.

Researchers have been exploring algorithms that would allow computers to process data at the quantum level on a theoretical basis for many years, but only now are the physical capabilities of quantum computers starting to catch up to this theory. This could create opportunities for quantum AI that could allow for the development of machine learning algorithms with less data.

It's still early, as existing quantum computers face several technical limitations related to encoding quantum data, error-correction and the length of time that calculations take. But researchers looking to create a more human-like type of artificial intelligence may have to overcome these challenges. Some evidence suggests that there is a quantum basis for human intelligence that complements neural networks.

Targets of today's research

In the meantime, AI researchers will need to learn new approaches to building quantum AI.

"Regardless of task, the algorithms that run on quantum computers are significantly different from those designed to run on classical computers," said Bob Sutor, vice-president of quantum computing at IBM Research.

Sutor acknowledges there is still considerable work to be done in terms of developing algorithms for AI within the constraints of today's approximate quantum computing systems. But there has been some early research into artificial neural networks run on the 5-qubit IBM Q Experience device published by a team at the University of Pavia in Italy.

In the short run, quantum algorithm research could also inspire better AI on classical computers.

"There have been examples of scientists discovering more-efficient ways to solve problems in machine learning on classical computers due to what's been learned about quantum algorithms," Sutor said.

For example, University of Washington doctoral student Ewin Tang developed a better recommendation system following his research into quantum AI.

Early work on existing quantum computers also identified AI algorithms that seem to work better than classical computers. For example, IBM worked with Raytheon BBN in 2017 to perform certain black box machine learning tasks more efficiently.

Quantum computing-inspired machine learning algorithms could also enable developers to train models with less data or better understand the structures and categories hidden within the data. Canadian quantum computing company D-Wave has launched a machine learning business unit to help with this.

There are many directions for improving quantum AI algorithms, said Michael Hartmann, associate professor at the Institute of Photonics and Quantum Sciences School of Engineering and Physical Sciences at Heriot-Watt University in Scotland. One line of research is looking at how to make the individual calculation steps of machine learning algorithms faster. Another line of research is exploring how quantum machine algorithms could operate on a lower level of abstraction that is directly linked to the physical operations of the quantum processor.

Dealing with errors

There are a variety of approaches being explored for building quantum computers using different physical phenomena. But they share many common challenges.

"In all efforts to build quantum information processing devices, keeping error rates sufficiently low is the biggest challenge," said Hartmann.

Information stored in a quantum device is much more fragile than information in classical computers.

Existing quantum computers work at very low temperatures so that the materials used in the circuits restrict energy flow and prevent the basic units of information -- qubits -- from changing state. The electrical circuits used in classical computers at room temperature would destroy all quantum information.

Quantum computing depends on maintaining coherence across the qubit computing elements in a quantum system. Qubits are the quantum equivalent of bits, but a qubit can encode significantly more information than a bit. Qubits are also more prone to errors. Today's quantum computers are considered approximate systems: they have errors and short coherence times within which to run algorithms.
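The basic behavior described above can be illustrated without any quantum hardware. The sketch below is a minimal, purely classical simulation of a single qubit: its state is a pair of complex amplitudes, a Hadamard gate creates a superposition, and measurement collapses it probabilistically. This is standard textbook formalism, not any vendor's API.

```python
import random

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measurement yields 0 with probability |a|^2 and 1 with probability |b|^2.

def hadamard(state):
    """Apply a Hadamard gate, sending a basis state into equal superposition."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state, rng=random.random):
    """Collapse the state to a classical bit."""
    a, _ = state
    return 0 if rng() < abs(a) ** 2 else 1

zero = (1 + 0j, 0 + 0j)   # qubit prepared in |0>
plus = hadamard(zero)     # equal superposition of |0> and |1>

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(plus)] += 1
print(counts)  # roughly 5,000 of each outcome
```

A real device differs from this idealized simulation in exactly the way the article describes: noise perturbs the amplitudes between gates, which is why coherence times limit how long an algorithm can run.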

"There are physical device limitations we still need to overcome before we have fault-tolerant universal quantum computers that operate with the stability we expect from classical computers," IBM's Sutor said.

As it turns out, some quantum AI algorithms are less affected by errors. Hartmann said the Quantum Approximate Optimization Algorithm is a strong candidate to run on near-future quantum computers, as it does not require quantum error correction, which carries a large resource overhead.

Another big challenge lies in encoding data into quantum memory systems in a way that maintains its quantum state, and loading classical data into a quantum memory is demanding.

"If you want to exploit the ability of a quantum computer to handle much more data than a classical one, you still need to convert the classical data into quantum data at the beginning," Hartmann said.

This can be the dominant effort when building a practical algorithm. And this process must be repeated for each new machine learning application because researchers have not found a way to store quantum state data for very long.
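One commonly proposed scheme for this conversion step is amplitude encoding: a classical vector is normalized so its squared magnitudes sum to 1, letting 2^n values occupy the amplitudes of just n qubits. The sketch below shows only the classical normalization step, which is the easy part; actually preparing a physical quantum state with those amplitudes is where the cost Hartmann describes comes in.

```python
import math

def amplitude_encode(data):
    """Normalize a classical vector so its squared magnitudes sum to 1,
    as required of valid quantum-state amplitudes (amplitude encoding)."""
    norm = math.sqrt(sum(x * x for x in data))
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return [x / norm for x in data]

# Four classical values fit in the amplitudes of two qubits (2^2 = 4).
amps = amplitude_encode([3.0, 1.0, 2.0, 1.0])
assert abs(sum(a * a for a in amps) - 1.0) < 1e-9
```

The exponential packing is the appeal: a million-element dataset would, in principle, need only about 20 qubits to hold, though state preparation circuits for arbitrary data can be deep and expensive.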

Hope for human-like AI

A popular idea suggests that we are on the verge of creating classical computers with more processing power than humans possess, which could lead to sentient machines, or the singularity.

But researchers like Roger Penrose and Stuart Hameroff think this is off base. In the mid-90s, they postulated that there may be a quantum basis to human intelligence with their Orchestrated Objective Reduction theory. The implication of the theory is that AI will have to move beyond classical computing models if it is going to replicate human-like intelligence.

Late last year, Pavlo Mikheenko, associate professor of condensed matter physics at the University of Oslo, found some physical evidence that quantum strings exist in human brains. He noted that this work is still early, and he is waiting for others to confirm his findings.

"My research directly suggests that there is a quantum aspect to biological intelligence," he said. "The brain seems to be both superconducting and quantum."

One big challenge in replicating the quantum information processing of the brain in quantum computers is that information processing structures in the brain appear to be highly dynamic: they seem to break down after a few minutes, only to be recreated within the cells. In contrast, existing approaches to building quantum computers are designed to work with stable quantum structures.

"It would be helpful if the relevant quantum computing technology was the same as the memory storage rather than having to search and introduce memory," Hameroff said. "Quantum processing relevant to consciousness runs on the same structures encoding memory."

Getting some practice

It may be a while before quantum hardware catches up with the theory. But in the meantime, developers can learn some of the basic principles by tinkering with quantum computing cloud services like IBM's Qiskit and D-Wave's Leap Quantum Application Environment.
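A typical first exercise in these environments is preparing a two-qubit entangled Bell state. The sketch below reproduces that exercise in plain Python, simulating the statevector directly rather than using any SDK, so the arithmetic behind the standard Hadamard-plus-CNOT circuit is visible.

```python
# Statevector of two qubits: four complex amplitudes, indexed as |q1 q0>.
state = [1 + 0j, 0, 0, 0]  # start in |00>

def apply_h_q0(state):
    """Hadamard on qubit 0: mixes each pair of amplitudes differing in bit 0."""
    s = 2 ** -0.5
    out = state[:]
    for i in (0, 2):
        a, b = state[i], state[i + 1]
        out[i], out[i + 1] = s * (a + b), s * (a - b)
    return out

def apply_cnot(state):
    """CNOT with qubit 0 as control: swaps the |01> and |11> amplitudes."""
    out = state[:]
    out[1], out[3] = state[3], state[1]
    return out

bell = apply_cnot(apply_h_q0(state))
# Bell state: equal probability on |00> and |11>, zero elsewhere.
print([round(abs(a) ** 2, 3) for a in bell])  # [0.5, 0.0, 0.0, 0.5]
```

Measuring either qubit of this state instantly determines the other, which is the entanglement the cloud services let developers observe on real hardware.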

"AI is, for many developers, on the top of the agenda for the first applications of quantum computers that are currently being developed," said Hartmann.

Existing quantum computers only demonstrate a proof of principle, but Hartmann sees many companies pushing hard to achieve the first demonstration of quantum computing power that exceeds the capabilities of classical AI.

"I expect the demand for it to be huge once it is there," Hartmann said.

This was last published in March 2019
