Mitigating Quantum Errors using Clifford Data Regression

How to tap the potential of the current noisy quantum computers

Do you want to get started with Quantum Machine Learning? Have a look at Hands-On Quantum Machine Learning With Python.

Image by author

Quantum computing has been touted as a potential game-changer in areas such as cryptography, machine learning, and artificial intelligence. While some believe that quantum computers are already within reach, others argue that they are decades away from being commercially viable.

The main obstacle is that quantum systems are extremely fragile. This makes them very sensitive to environmental disturbances and prone to errors. So it is fair to say that building a practical, working quantum computer remains a challenge, and one of utmost importance and current interest.

For instance, in its second Quantum Open Science Prize, IBM offers $100,000 in prize money to the teams whose algorithms best handle these errors.

IBM provides a working example that simulates the Hamiltonian of a three-particle Heisenberg model using Trotterization. We can run it on our laptop (that is, on a classical simulator), and it achieves a fidelity, the performance score, of almost 1.0, the best possible value.
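To make this concrete, here is a minimal sketch of such a Trotterized time evolution in Qiskit. It is not IBM's prize notebook: the coupling terms, the total time `t`, the number of Trotter steps, and the initial state are illustrative assumptions, and the fidelity is computed against exact matrix exponentiation of the same Hamiltonian.

```python
# Minimal sketch: first-order Trotterization of a 3-qubit Heisenberg XXX
# Hamiltonian on a classical simulator. Parameters are illustrative, not
# IBM's exact setup.
import numpy as np
from scipy.linalg import expm
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp, Statevector, state_fidelity

t = np.pi        # total evolution time (assumed value)
steps = 8        # Trotter steps; more steps -> smaller Trotter error
dt = t / steps

# H = sum over neighboring pairs (0,1) and (1,2) of XX + YY + ZZ
# (Qiskit labels are little-endian: the rightmost character is qubit 0)
H = SparsePauliOp(["IXX", "IYY", "IZZ", "XXI", "YYI", "ZZI"])

def trotter_step(qc, dt):
    # XX, YY, ZZ on the same pair commute, so each pair term is exact;
    # the Trotter error comes from splitting the two pair terms.
    for q0, q1 in [(0, 1), (1, 2)]:
        qc.rxx(2 * dt, q0, q1)
        qc.ryy(2 * dt, q0, q1)
        qc.rzz(2 * dt, q0, q1)

qc = QuantumCircuit(3)
qc.x([0, 1])                     # prepare qubits 0 and 1 in |1>
for _ in range(steps):
    trotter_step(qc, dt)

# Exact evolution of the same initial state for comparison
psi0 = Statevector.from_label("011").data
exact = expm(-1j * t * H.to_matrix()) @ psi0

print("fidelity:", state_fidelity(Statevector.from_instruction(qc), Statevector(exact)))
```

On a noiseless simulator, the fidelity is limited only by the Trotter error, so it approaches 1.0 as the number of steps grows; the interesting problem, and the subject of this article, is keeping it high on noisy hardware.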
