Rene: Hi! Welcome to Season 2 of QuBites, your bite size pieces of Quantum Computing. My name is Rene from Valorem Reply and today we're going to talk about quantum machine learning with a focus on natural language processing. For this, I'm very honored to have a special expert guest today Alessandro Farace. Ciao Alessandro and welcome to the show! How are you today?

Alessandro: Ciao Rene! I'm happy to be here! I'm fine and looking forward to our chat about quantum NLP.

Rene: Awesome! So before we dive a little bit more into these topics, can you tell us a little bit about yourself and your background as it relates to quantum computing?

Alessandro: Yes, sure. So I've been working now for more than two years at Machine Learning Reply as a Data Science Consultant. I've been mostly doing projects on natural language processing, so dealing with text documents and natural language. But, my background comes actually from physics, so I did a PhD in [Università di] Pisa on quantum optics, quantum information theory, and then I did three years of research as a postdoctoral researcher on developing algorithms for quantum simulations. So, a lot of topics related to quantum computing. And now at Reply we have a quantum computing community of practice where we look at business applications of quantum technologies and quantum algorithms.

Rene: Alright, awesome! Well, again another PhD on the show, but we are going to talk about [quantum computing in a way] that people who don't have a PhD, like me, can understand. That's the goal [of QuBites], right? So, let's talk about natural language processing, or NLP. First of all, can you explain what NLP is all about and what it actually includes?

Alessandro: Yes. So, NLP is a field at the intersection of linguistics, computer science, and artificial intelligence, with the goal of processing and analyzing natural language. So, it's about making a machine understand the content of speeches and documents, and being able to process information that is not written in a structured way. Text documents have a lot of variability, a lot of nuances related to language, and they can be in many different languages, and we want to teach a machine how to [understand] this. So, there are several tasks that fall under the NLP topic, from:

  • Speech recognition. So, understanding spoken language.
  • Generating machine text, for example translating from one language to another.

To more sophisticated topics like:

  • Extracting information [from] the documents.
  • [Classification of] documents based on their content.
  • Sentiment analysis. So, we want to understand if a review or a message expresses a negative or a positive sentiment.
  • And also a very important topic is question answering. So, finding the answer to a question from a corpus of documents. This can also be used for fact checking or for chat bots.

[NLP is] really a topic that has many applications even in industry. And this topic also…

Rene: And also conversational AI systems, right?

Alessandro: Yes, exactly. And an important thing is that this topic has seen a lot of growth in recent years, thanks to the huge amount of data available on the web. On the web you find a lot of written documents and text data, and there has also been a lot of progress on the machine learning side. So, now we have really powerful models that can understand the meaning of speech and documents, and they can help us solve all of these tasks.

Rene: Gotcha, gotcha. Yeah! And like you were saying, there was an immense uptake thanks to all the open research that is ongoing, but also the immense amount of data and processing power available to us these days, right? And yeah, it's super impressive [the progress] you see already with conversational AI systems, [where] you might not actually notice that you're talking to a machine, right? And this is the level we are really approaching here. So now, coming to the quantum computing world, how is quantum computing [actually] providing benefits in the field of NLP (natural language processing), and what is the latest state of the art in the field?

Alessandro: Yes. So maybe let me start by saying a few words on the state of the art of classical NLP, just to set the context. There we now [have] very powerful models that are able to understand the meaning of words. [Machines] learn the meaning of words by looking at the context, using the key idea that similar words will appear in similar contexts and similar sentences, or will appear close to other similar words. So now these models are very good at capturing synonyms and understanding ‘ok, this word really carries this semantic meaning.’ What is harder for these models is to understand the meaning of a sentence in a longer piece of text. A word will appear several times in many documents, so we have a dictionary of words and we can understand the meaning of words. However, each sentence will appear more or less once; it is very rare to see a sentence repeated. Sentences are much more variable, so it's harder to capture their meaning. Very recently, huge progress [has been made] with very large models. One example is the GPT-3 model, which people [may have] heard about; it was in the news recently. This model has a lot of parameters and requires a lot of training data. It has been trained for a long time on very powerful resources, and it is starting to give promising results on [the classical NLP] side. So, basically, these models have to learn from a lot of data all the structure and the grammar that we humans use to understand text.
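The idea that words appearing in similar contexts end up with similar learned representations can be sketched with toy embedding vectors and cosine similarity (the vectors below are made-up numbers purely for illustration; real models learn vectors with hundreds of dimensions from large corpora):

```python
import numpy as np

# Hypothetical 4-dimensional "embeddings" (illustrative numbers only):
embeddings = {
    "car":  np.array([0.9, 0.1, 0.0, 0.2]),
    "auto": np.array([0.8, 0.2, 0.1, 0.3]),
    "fish": np.array([0.0, 0.9, 0.7, 0.1]),
}

def cosine_similarity(u, v):
    """Similarity of two word vectors: close to 1.0 = similar meaning."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Synonyms learned from similar contexts end up pointing the same way:
print(cosine_similarity(embeddings["car"], embeddings["auto"]))  # high
print(cosine_similarity(embeddings["car"], embeddings["fish"]))  # low
```

This is the sense in which a model "captures synonyms": nearness in vector space stands in for nearness in meaning.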

Now on the Quantum NLP side, people recently introduced some new language models from the linguistic side which use grammar as a rule. So, they don't learn it from the data, but they enforce it as a rule. And this is computationally expensive classically: a classical machine would need too many resources to do that, and it's almost impossible if you want to analyze long documents. But it turns out that these kinds of mathematical rules that describe grammar have a natural formulation as quantum circuits. So, with a quantum computer we can combine the meaning of words, which is what the models can already do very well now, with the grammar part [added], and this should help in boosting all the tasks that require understanding the meaning of sentences. Because now, the meaning of the sentence will be composed from the meaning of words using these extra grammatical rules that we impose on the sentence. And on top of that, there are also some nice features [with Quantum NLP]. There are words that have several meanings in different contexts. For example, bank could be the building where you go, it could be a riverbank, it could be a place where you sit. And in classical [NLP], you just give it one meaning, or it's harder to discern between meanings. With concepts like superposition and other quantum properties, you can naturally encode this variability of meaning for a single word. So, you get a lot of new ingredients that are very hard to do classically, which could boost your learning capabilities. And what is also nice is that we now have formulations of this quantum model that work on Noisy Intermediate-Scale Quantum (NISQ) devices, the state-of-the-art hardware that is available now.
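The superposition idea for an ambiguous word like “bank” can be sketched with a qubit-like state vector (this is a plain numpy illustration, not a real QNLP library, and the equal amplitudes are a hypothetical choice):

```python
import numpy as np

# Basis states for two hypothetical senses of the word "bank":
financial = np.array([1.0, 0.0])   # |financial institution>
river     = np.array([0.0, 1.0])   # |riverbank>

# Before context disambiguates, encode the word as an equal superposition:
bank = (financial + river) / np.sqrt(2)

# Squared amplitudes give the weight of each sense (Born rule):
p_financial = abs(np.dot(financial, bank)) ** 2   # 0.5
p_river     = abs(np.dot(river, bank)) ** 2       # 0.5
print(p_financial, p_river)
```

Context would then shift these amplitudes toward one sense, which is exactly the kind of behavior that is awkward to represent with a single fixed vector classically.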


And so, people have already done the first experiments on limited vocabularies or small training data sets. But it's not something that will require a huge leap in technology; it is something that we can start seeing now. And there are also some initial results showing that you can get a speed-up in tasks related to NLP, like understanding sentence similarity and word similarity. And this is really one of the key tools that you need for all [NLP] tasks like question answering, classification, and topic modeling. So, it's really a promising field. Not just because you can do something faster or better, but because you can do something new. This could really have some potential in boosting our capabilities to do natural language processing.
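Composing sentence meaning from word meaning plus grammar, and then comparing sentences, can be sketched in the spirit of these compositional models: a transitive verb acts as a tensor that is contracted with subject and object vectors to produce a sentence vector (toy dimensions and random numbers, purely illustrative):

```python
import numpy as np

d = 2  # toy dimension for both the noun space and the sentence space

# Hypothetical word representations:
alice = np.array([1.0, 0.0])
bob   = np.array([0.0, 1.0])
# A transitive verb as a rank-3 tensor with axes (subject, sentence, object):
likes = np.random.default_rng(0).random((d, d, d))

def sentence_meaning(subj, verb, obj):
    """Contract subject and object into the verb tensor -> sentence vector."""
    return np.einsum("i,isj,j->s", subj, verb, obj)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = sentence_meaning(alice, likes, bob)   # "Alice likes Bob"
s2 = sentence_meaning(bob, likes, alice)   # "Bob likes Alice"
print(cosine(s1, s1))  # identical sentences are maximally similar
print(cosine(s1, s2))  # swapped roles generally give a different meaning
```

The contraction pattern mirrors the grammatical structure of the sentence, which is the part that becomes expensive classically and maps naturally onto quantum hardware.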

Rene: Well, that makes a lot of sense. And especially, it clicked a lot for me, when you said one word might have different meanings and this can be expressed with a qubit and superposition. Because [qubits] can have multiple values at the same time or can store multiple [pieces of] information at the same time. It just fits naturally right into natural language processing with quantum computing. Well, that is amazing and what is Reply actually doing in these areas, and can you share some examples?

Alessandro: Yes. So, at Reply, Machine Learning Reply, we have actually done several projects in topics related to natural language processing. A lot, for example, in customer management for companies that deal with a lot of customer messages and want to, for example, understand the topics of the conversations. We did a lot in terms of classification of customer requests, [to allow companies to] automatically extract the information that you need to process these [customer] requests. For example, a customer writes ‘I would like to change my contract to this and this,’ or ‘I'm moving to a new apartment, please change my address.’ There are a lot of messages that you would like to process automatically, to understand what the customer wants and what information is sitting in the text that [indicates] you need to somehow act on this request. Also, sentiment analysis is a very big topic: understanding if the customer is complaining about something and why, and how you can react to this before the customer gets really angry. And NLP also plays a huge role in finance and insurance, where banks and insurance companies are also interested in getting information from external news sources about new risk factors and new trends, understanding what is out there that could help them better shape their risk profile and deal with their investments more carefully. So a lot of [NLP applications are] also about discovering new trends, discovering new relevant information from the news, etc. This is something that has a lot of potential for all companies that would like to leverage this kind of information that is available out there.

On the quantum side, we are starting now some first experiments on these quantum NLP topics. Of course, this is a very experimental and research [driven] field at the moment, but we are trying to do our first implementations and see and understand what can be done today with the available hardware. So how far can we push it? Or what could be some industry applications of these early algorithms that we can write and use?

Rene: Well, that is exciting for sure and like you said, it's bleeding edge, right? We’re really at the level of research and helping to progress that forward, so that is amazing. Unfortunately, we’re already at the end of the show. We could talk for many more hours for sure! But anyway, thank you so much Alessandro for joining us today and sharing your insights with all of us, that is very much appreciated.

Alessandro: Yes! I'm happy to have been here. It was a nice chat with you. I hope people could learn something interesting about quantum computing. I personally find this very exciting, so thanks for the chat!

Rene: Absolutely! And thanks everyone for joining us today for another episode of QuBites, your bite size pieces of quantum computing. Watch our blog, follow our social media channels, to hear all about the next episodes and of course, also to find the previous episodes, which you can watch anytime, right? Take care, be safe, and see you soon. Bye bye!

Alessandro: Bye, bye!