QuBites 6.8 - Progress in Quantum Machine Learning


Rene Schulte February 01, 2023



In this episode of Qubites Season 6, we explore the latest advancements in the field of quantum machine learning. Our guest, Johannes Oberreuter, is a renowned expert in the field and will shed light on the current state of quantum machine learning and its potential applications.


Transcript


Rene - Hi, welcome to QuBites, your bite-sized pieces of quantum computing. My name is Rene from Valorem Reply, and today we're going to talk about the progress in quantum machine learning. I'm honored to have a special expert guest, Johannes Oberreuter, joining us. You may remember him from previous conversations about quantum machine learning in Season 1, Episode 6, and Season 4, Episode 1. Hi Johannes, and welcome to the show. How are you today?


Johannes - Hi, Rene. I'm very good. Thanks for having me again. It's a great pleasure to be back. How are you doing?


Rene - I'm also doing fantastic, and you know, wrapping up the year. Can you tell us a little bit about yourself and your background for those folks who haven't seen you on previous episodes?


Johannes - I'd be happy to. My name is Johannes. I'm a physicist by training. I did a PhD in string theory and then two post-docs in German universities on quantum many-body dynamics before joining Reply as a data scientist. I'm now co-leading the innovation group for quantum computing and the quantum computing practice of Reply in Germany. As such, I'm also doing a lot of research into the applications of quantum computing.


Rene - Awesome. Let's talk about our topic for today, quantum machine learning. In Season 4, Episode 1, we talked about QML in the context of image classification. Back then, you and your team were working on a challenge that used image classification. Can you explain it a bit more, tell us what has happened since then, and share any news or learnings from that work?


Johannes - Certainly. Back then, the challenge posed by BMW and co-hosted by AWS was to classify parts in production images, taken by cameras on the production line, as defective or not. That is not a very novel use of image classification, but the interesting question was whether we could get extra learnings and benefits from using quantum computing. It was really cool to explore this in a real-world case, which is always more complicated than you would hope for. We were not really expecting to gain a quantum advantage on this problem because, as I said, it is not new; it's something we have done before, and we know the classical technologies work very well. So we aimed for something general, versatile, and flexible, something you can use in various circumstances while controlling the individual parts very well, and we tried to learn something general about quantum computing and machine learning. The result of the challenge was actually quite encouraging: the quantum architecture we trained was comparable with the classical approach. Given the state of the technology, we found that very encouraging.

The question then is, of course, how well that would work in practice, which is what we were really aiming for. We have to admit that most of our results were generated on simulators, not real quantum machines. We also tried quantum machines, of course, but there is a lot to learn there: new algorithms, the specifics of the hardware, and how to handle all the practical problems, so there is a lot of experimentation going on. On the one hand, we trained and evaluated our architecture and setup on simulators; on the other hand, we tried it on actual hardware. On the hardware, the picture was reversed, with all the errors and noise you get in practical computations. The simulators showed a lot of potential, but the actual hardware still has problems with noise and availability. Those were all known issues and nothing that surprised us; it was just interesting to confirm them.
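To make the simulator-versus-hardware gap concrete, here is a minimal, hypothetical sketch in PennyLane: the same small parametrized circuit evaluated on a noiseless simulator and on a density-matrix simulator where a depolarizing channel stands in for hardware noise. The circuit, feature values, and noise level are illustrative placeholders, not the setup used in the challenge.

```python
# Minimal sketch: the same parametrized circuit, evaluated without and with noise.
# Assumes PennyLane; circuit shape and noise strength are illustrative only.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
weights = np.random.uniform(0, np.pi, qml.BasicEntanglerLayers.shape(n_layers=2, n_wires=n_qubits))
x = np.random.uniform(0, np.pi, n_qubits)  # stand-in for extracted image features

dev_ideal = qml.device("default.qubit", wires=n_qubits)  # noiseless statevector simulator
dev_noisy = qml.device("default.mixed", wires=n_qubits)  # density-matrix simulator (supports noise channels)

@qml.qnode(dev_ideal)
def ideal(x, weights):
    qml.AngleEmbedding(x, wires=range(n_qubits))              # encode features as rotation angles
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))  # trainable entangling layers
    return qml.expval(qml.PauliZ(0))

@qml.qnode(dev_noisy)
def noisy(x, weights):
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    for w in range(n_qubits):
        qml.DepolarizingChannel(0.05, wires=w)                # crude stand-in for hardware noise
    return qml.expval(qml.PauliZ(0))

print("ideal simulator:", ideal(x, weights))
print("noisy simulator:", noisy(x, weights))
```

On real backends the effective noise is more structured than a single depolarizing channel, which is part of why hardware results lag the simulations Johannes describes.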
Given that, we still believe in this approach and are working on broadening it to more use cases, because the hardware is catching up. There's a lot of progress, and we really want to be there when the point comes where the power is available and we can use it. Meanwhile, we have generalized the approach a bit: there is a classical part, which leverages the benefits of pre-trained CNNs to reduce compute costs, and a quantum part, which embeds the information in a quantum state so we can reap some benefits from that. In this quantum state you have a lot of parameters to choose from, and it's not necessarily clear which ones you should increase. One thing you do know you want to increase is entanglement, the strange behavior of quantum particles that lets them still talk to each other even when they are far apart. This was a long controversy in physics: Einstein famously objected that, if it were true, it would be the end of physics as a science. But this strange behavior has been measured, and there was a Nobel Prize for it this year, for people like Zeilinger who proved it in the lab. We understand how entanglement arises, and it's accepted now. This is the resource for quantum computing that we want to produce. There are various ways to do it, and different ways may be an advantage for different data, so we are working to make exchanging quantum circuits easier and to benchmark various circuits against each other. For image classification, surprisingly, there wasn't much benefit from using quantum computing; we had expected at least a drop in accuracy for certain architectures, but even that didn't happen. So the next step is to try it on other tasks, such as sensor data, which I am very interested in. We are also looking at the classical part of the setup, such as convolutional neural networks and Transformer models. Transformers have shown astonishing capabilities in modeling language, and recently it's been shown that they can also be used for image classification. That kind of versatility is what we are trying to achieve with our quantum model as well.
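As a hedged sketch of what such a hybrid setup and circuit benchmarking could look like (in PennyLane, with placeholder sizes, random stand-in CNN features, and two off-the-shelf entangling templates; this is not Reply's actual architecture):

```python
# Hybrid sketch: features from a (frozen, pre-trained) classical network are embedded
# into a quantum state, passed through an entangling circuit, and the measured
# expectation values serve as learned features. Sizes and circuits are placeholders.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def make_qnode(entangler):
    """Build a QNode for a given entangling template, so circuits can be swapped and benchmarked."""
    @qml.qnode(dev)
    def circuit(features, weights):
        qml.AngleEmbedding(features, wires=range(n_qubits))  # embed classical features in a quantum state
        entangler(weights, wires=range(n_qubits))            # entanglement as the quantum resource
        return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]
    return circuit

# Two candidate circuits to benchmark against each other.
candidates = {
    "basic": (qml.BasicEntanglerLayers,
              qml.BasicEntanglerLayers.shape(n_layers=2, n_wires=n_qubits)),
    "strong": (qml.StronglyEntanglingLayers,
               qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)),
}

# Stand-in for a pre-trained CNN embedding reduced to n_qubits values (e.g. via a small dense layer).
cnn_features = np.random.uniform(0, np.pi, n_qubits)

for name, (entangler, shape) in candidates.items():
    weights = np.random.uniform(0, np.pi, shape)  # in a real setup these would be trained
    print(name, make_qnode(entangler)(cnn_features, weights))
```

In practice the quantum outputs would feed a small classical classifier head and be trained end to end while the pre-trained CNN stays frozen, which is the compute-saving aspect Johannes mentions.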
Another aspect we're looking into is combining pre-trained Transformer models with our quantum idea. These models are huge, with millions of parameters and training costs in the millions of dollars, so we're investigating whether we can reuse pre-trained models in our approach. We have been talking about one aspect of machine learning, but it's a very broad field with many applications. In the area of language modeling, a group in Oxford has the idea that language should inherently be modeled on a quantum computer, which is different from what we have been doing so far. We have been trying to solve specific problems on a quantum computer, but now we would like to see whether adopting this approach, which combines meaning and grammar, will be interesting.
Coming from the other side, so to speak, and devising specific models for specific applications like language or images, you can also design quantum filters specifically for images. This could be an advantage, but of course you have to do it again for every problem and problem class. It hasn't been experimented with much yet, but it's something we've already started and will continue in the future.
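To illustrate what a quantum filter for images could look like, here is a minimal, hypothetical quanvolution-style sketch in PennyLane: a small fixed circuit is applied to each 2x2 patch of an image, and the measured expectation values become output channels. This is only one possible reading of the idea, not the specific filters being developed.

```python
# Illustrative "quantum filter": a tiny circuit applied to 2x2 image patches,
# similar in spirit to quanvolutional layers. Placeholder circuit and data.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4  # one qubit per pixel of a 2x2 patch
dev = qml.device("default.qubit", wires=n_qubits)
rand_weights = np.random.uniform(0, 2 * np.pi, (1, n_qubits))  # fixed random layer

@qml.qnode(dev)
def patch_circuit(patch):
    qml.AngleEmbedding(np.pi * patch, wires=range(n_qubits))       # encode pixel values as rotations
    qml.BasicEntanglerLayers(rand_weights, wires=range(n_qubits))  # entangle the patch qubits
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

def quantum_filter(image):
    """Slide the patch circuit over an image (H x W, values in [0, 1]) with stride 2."""
    h, w = image.shape
    out = np.zeros((h // 2, w // 2, n_qubits))
    for i in range(0, h - 1, 2):
        for j in range(0, w - 1, 2):
            patch = image[i:i + 2, j:j + 2].flatten()
            out[i // 2, j // 2] = np.stack(patch_circuit(patch))
    return out

image = np.random.uniform(0, 1, (6, 6))   # toy grayscale image
print(quantum_filter(image).shape)        # (3, 3, 4): one feature map per measured qubit
```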

Rene - Awesome. You've already touched on the latest research on QML-based classification and on the hybrid approach. Is there anything else you want to talk about?


Johannes - I think it's quite important that we now develop a deeper, grounded understanding of which technologies to use in which case, similar to the level of understanding we have in classical machine learning. In classical machine learning, given a problem, most engineers and data scientists would come up with a similar solution and say, "Okay, in this setup you have these two possibilities; try these architectures that are new on the market." At the moment, for quantum computing, that is not yet the case. There are a few cases where we do understand what to do; in quantum chemistry especially, there is a lot where we understand that quantum computing can give an advantage. For quantum machine learning, my feeling is that we are trying to apply it to everything, but I don't believe there's an advantage everywhere. We should understand where we can really benefit from which setup of quantum computing.
To do this, we need a structured way of benchmarking use cases and architectures. The interesting thing is that new things arrive every month: new startups, new strategies for building architectures, new money coming into the field, and new developments on the software side (SDKs). We don't yet understand how this field will look in ten years, and it's very unclear whether a specific approach to implementing quantum computation will suddenly improve the situation, reduce the noise, or be better at error correction and error handling. All of these things are something we need to understand.
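As a hedged illustration of what such structured benchmarking could look like in code (the use cases, trainers, and scores below are toy placeholders, not the framework being built):

```python
# Hypothetical benchmarking harness: run every candidate architecture on every use case
# and tabulate a common metric. All entries here are toy placeholders.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class BenchmarkResult:
    use_case: str
    architecture: str
    score: float  # e.g. test accuracy

def run_benchmarks(
    use_cases: Dict[str, Callable[[], object]],           # name -> data loader
    architectures: Dict[str, Callable[[object], float]],  # name -> train-and-score function
) -> List[BenchmarkResult]:
    results = []
    for uc_name, load in use_cases.items():
        data = load()
        for arch_name, train_and_score in architectures.items():
            results.append(BenchmarkResult(uc_name, arch_name, train_and_score(data)))
    return results

def recommend(results: List[BenchmarkResult], use_case: str) -> str:
    """Pick the best-scoring architecture for a given use case."""
    candidates = [r for r in results if r.use_case == use_case]
    return max(candidates, key=lambda r: r.score).architecture

# Toy usage: dummy loaders and scores standing in for real training pipelines.
use_cases = {"image_quality_check": lambda: None}
architectures = {"classical_cnn": lambda data: 0.91, "hybrid_quantum": lambda data: 0.90}
results = run_benchmarks(use_cases, architectures)
print(recommend(results, "image_quality_check"))
```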
We are currently collaborating with some research institutes and industry suppliers who are interested in benchmarking specific use cases and providing a framework to do this in a structured way. There's more to come, and it looks very promising; we will hopefully collaborate on this for quite a while. It's almost, but not quite, official yet, but I'm super excited about this development, because I think it will be a big advantage for the industry to have a box where you can put your use case in and it will more or less spit out which architecture would be recommended. This will also help us understand better how to do that.



Rene - Yeah, let's continue on that topic a bit and talk about the long-term vision of quantum computing and what we're doing at Reply. Any thoughts there?


Johannes - At Reply, this is one of the activities we're excited to be involved in. On the other hand, of course, we're doing consulting: we have clients, and we're helping them with our insights. We started from a broad approach, researching all kinds of applications and trends, and I would say we're now in a position to start shaping offerings for individual industries. One industry that has always been interesting for quantum computing is finance. There, we're doing some research with clients, but we also have hands-on experience with certain algorithms, which we can offer to clients to implement and help put into production with quantum-inspired architectures, using simulators to run the quantum algorithms. The financial industry is certainly an area where we can shape an offering. By the way, we're finalizing a paper on portfolio optimization with a specific application of this, where we show some tweaks and tricks we came up with to make life easier, which could be interesting to the industry. We're still working a lot in that direction. Other industries, such as automotive or engineering, are also interesting because many of the problems are similar; for example, materials simulation and machine learning are relevant for all engineering applications. We have already developed some example cases, which we can now offer to the market, so I would say we have enough experience to focus on specific industries more deliberately. We also keep contributing to the field's development, for instance by benchmarking and writing papers. It keeps evolving, and I don't know what will be interesting next year, but we're continuing along the lines of our experience so far and taking on whatever new insights the research community offers.
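As a hedged illustration of the kind of problem involved in portfolio optimization (not the formulation or method in the paper Johannes mentions): asset selection can be written as a small QUBO-style objective, which at toy scale can be checked by brute force and at larger scales handed to quantum or quantum-inspired solvers.

```python
# Toy portfolio selection as a QUBO-style objective, solved by brute force.
# Illustrative only; the data, formulation, and solver in the actual work may differ.
import itertools
import numpy as np

returns = np.array([0.08, 0.12, 0.10, 0.07])        # expected returns of 4 assets
cov = np.array([[0.10, 0.02, 0.01, 0.00],
                [0.02, 0.12, 0.03, 0.01],
                [0.01, 0.03, 0.09, 0.02],
                [0.00, 0.01, 0.02, 0.08]])          # covariance (risk) matrix
risk_aversion = 0.5
budget = 2                                          # pick exactly 2 assets
penalty = 10.0                                      # enforces the budget constraint

def cost(x):
    """Markowitz-style objective: risk minus return, plus a budget penalty."""
    x = np.array(x)
    return (risk_aversion * x @ cov @ x
            - returns @ x
            + penalty * (x.sum() - budget) ** 2)

# Enumerate all binary selections; a quantum annealer or QAOA would search this space instead.
best = min(itertools.product([0, 1], repeat=len(returns)), key=cost)
print("selected assets:", best, "cost:", round(float(cost(best)), 4))
```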


Rene - And like I said, it's moving so fast, and many things are happening, for instance in quantum chemistry, which you just mentioned. I had a really impressive conversation with Shahar Keinan in episode six of season six, where we talked about the personalized medicines they're developing and how they're shortening the path for drug discovery: something that used to take around four years they can now do in a few months using quantum optimization, which could be life-changing for people. It's impressive how fast it's going, and like you're saying, we'll see next year what else will be happening. It's a really fast-paced, impressive industry.

Johannes - It's always a good idea to be skeptical. Especially on our side, we're not entirely independent of other players in the market. There are a lot of startups, but it's not established enough that we can be entirely confident that the hardware will be there.

This is the exciting thing at the moment: hardware and software applications for quantum computers are being co-developed. While it's good to be skeptical, it's also good to recognize that the market players have so far been delivering the progress they promised. Of course, past results are never a guarantee for the future, but if this continues the way we have seen, we'll be at the point in 2025 where we can run applications on quantum computers in an industry setting. That is what the vendors are envisioning. I can't promise it, but I'm working with this assumption because the vendors have proven trustworthy so far. This is not far in the future; it's within our professional lifetimes. Getting new technologies into organizations is tricky, and there's a lot involved: identifying the cases where it can be useful, understanding how to do the procurement, the safety concerns, educating people, and implementing the algorithms. This is why we have been on this topic for a couple of years, and why we're excited to help our clients understand what the technology can do for them.


Rene - It's a current topic, and you mentioned the Nobel Prize in Physics that went to Anton Zeilinger, John F. Clauser, and Alain Aspect for, basically, pioneering experiments with entangled photons, proving that entanglement is not just a theory. And then, was it last month or so, another new paper was published, and Quanta Magazine had a big article about it: wormhole dynamics simulated on a quantum computer, basically illustrating the ER = EPR duality, the idea that wormholes and quantum entanglement are the same thing. It's impressive, the progress that is happening. Like you're saying, in a couple of years we will really know much more. It's not unlike nuclear fusion, which might take decades, except that here we will know much more within a couple of years. It's a super exciting time.


Johannes - I'm trying not to look down on any other technology. The situation in quantum computing is similar to nuclear fusion, in that the underlying physics works, but putting it into practice is difficult. The Nobel Prize honored people who showed us that the mathematical framework is correct and useful; even if we don't understand all the philosophical underpinnings, we can still work with it, because we understand the mathematics and how the systems behave. As with fusion, it has proven super difficult to get such a system into a stable state, and in a way we are at the same point with quantum right now: you want a big system that can deliver power, like a fusion reactor. There are problems, but there are also a lot of people and ideas that may offer something different. With all these startups, there is a rich universe of possibilities for implementing quantum computing, and it's not as expensive as building a fusion reactor, which is why I am more hopeful and confident that quantum computing will develop at a faster pace. Ideally, we would build a quantum computer first and then use it to help solve the energy problem of nuclear fusion; that is one of the applications of quantum computing, since it can be used to model fluid dynamics. Quantum computing will be an amazing tool in our tool belt for solving these complex problems, and once we have it, we will be able to use it in many ways.


Rene - We are at the end of the show now. Thank you so much, Johannes, for taking the time to talk to me. I enjoyed it. Thank you everyone for joining us for another episode of QuBites, your bite-sized pieces of quantum computing. Watch our blog, follow our social media channels, and subscribe to our YouTube channel to stay informed about upcoming episodes. You can also find previous episodes and full transcripts on our website. Thank you again, take care, and see you soon. Bye.