Transcript

Think about the last time you answered someone's question. How did you do that? You paid attention. You listened. You understood the ask. You interpreted the question in context. Maybe you asked a couple of follow-up questions, and then you searched for answers based on your memory, your previous experiences, your learnings, your insights, your perspectives. You finally arrived at a response and presented it as an answer.

Think about it. This is the exact same process we go through every single time. When our kids ask us for an extra scoop of ice cream, when our boss asks us for that report, our innate capabilities help us look through this unstructured universe of data and present that information as answers within seconds.

What if we could build applications to answer questions in a similar fashion? The question answering feature from Azure Cognitive Services helps us with just that.

Traditional software paradigms let users interact with applications through simple mouse clicks. You can click a button or a menu option, or navigate a page. But instead, imagine an open text box where users can type their query in natural language, like they would with a chatbot. How can we build applications that let users interact with us in their natural language? It sounds simple: you create a knowledge base of question-answer pairs and point the assistant or bot to it. When the user asks the bot a question, the bot searches the knowledge base and returns the most appropriate answer. Simple, right? Not quite, and here's why.

Let's say I built a bot for this AI series. I could ask the bot, hey, when is the next episode? The bot looks through the knowledge base and gets the response. The catch? Different people could ask the same question in different ways. Someone could ask what time is the next episode, or do we have an episode next week, or when will you release the next episode? All of these essentially point to the very same answer.
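To see concretely why a plain lookup falls short, here is a tiny, purely illustrative sketch in Python (not how the Azure service works internally): an exact-match table answers the original wording but misses every rephrasing of the same intent. The question and answer strings are made up for illustration.

```python
# A deliberately naive knowledge base: exact-match lookup only.
# Purely illustrative; NOT how the Azure question answering service works.
knowledge_base = {
    "when is the next episode?": "The next episode is announced on the series page.",
}

def answer(question: str) -> str:
    # Exact string matching breaks as soon as the wording changes.
    return knowledge_base.get(question.lower().strip(), "Sorry, I don't know.")

print(answer("When is the next episode?"))         # Matches: identical wording.
print(answer("What time is the next episode?"))    # Misses: same intent, different words.
print(answer("Do we have an episode next week?"))  # Misses again.
```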

So how can we accomplish this? First, we need a natural language processing service. It lets you create a natural conversational layer over your data and helps you find the most appropriate answer for a query. We need the ability not just to add question-answer pairs to the knowledge base, but also to add alternate phrasings of the questions our users could potentially ask. Sometimes we may need to ask follow-up questions; not all questions can be answered in one shot, so we need the ability to handle complex multi-turn conversations. Sometimes we don't have all the question-answer pairs up front, so we need support for extracting them from existing documents, FAQ pages, and so on. Maybe we add a little bit of personality, you know, chit-chat questions. All of this and more is supported in the question answering feature of Azure Cognitive Service for Language. Let me show you how.

So let me show you how some of the features we discussed in this episode work with question answering. To get started, as always, we need an Azure subscription. If you already have one, we're good to go; otherwise you can create a trial subscription, which comes with free Azure credit. Once you have that, head over to the marketplace and search for language resources. The easiest way to find it is to search for 'language', and you will see the Language service. Once you click create, you provide the details for your language service: the name of the service, the location, the Azure resource group, and so on. Give it a minute or two and it will create a language resource for you.

Of the various parameters you see once it's created, two things to note are the key and the endpoint. The key and the endpoint are what you need when you invoke the API from code: they authenticate your API calls, after which you can send in your text strings and parse the API responses. So that's one way of doing it. You can start with the preferred programming language of your choice; there is SDK support, and there are REST APIs you can invoke, so you can test these features out in code.

An easier way of doing that, without writing any code, is to use a web portal from Microsoft called Language Studio. You can try out a lot of natural language features there. The one we're going to try today is the answer questions feature. As we try it out, we see that we can select the language of our text to start with. We associate or reference an Azure resource, and then we have a couple of options: we can upload a document of our choice, or we can try one of the samples here. If I choose one of these samples, it looks like this one is about the Space Needle in Seattle, so there are two paragraphs on that, and I can go in and ask a simple question based on the passage. Once I run the service, we see what the API returns as the top answer. My question was 'what is Oculus?', and there is a long answer, a short answer, and a confidence score that the API returns. The confidence score is literally just that: it shows how confident the API is in the response it has provided. And in the text, you can see exactly where the API pulled that piece of text from.
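If you would rather drive this same text-over-passage scenario from code instead of Language Studio, here is a minimal sketch using the Python SDK (the azure-ai-language-questionanswering package), assuming the key and endpoint from the language resource we created above; the passage text and question are placeholders for your own content.

```python
# pip install azure-ai-language-questionanswering
from azure.core.credentials import AzureKeyCredential
from azure.ai.language.questionanswering import QuestionAnsweringClient
from azure.ai.language.questionanswering import models as qna

# Placeholders: use the key and endpoint from your own language resource.
endpoint = "https://<your-resource-name>.cognitiveservices.azure.com"
key = "<your-key>"

client = QuestionAnsweringClient(endpoint, AzureKeyCredential(key))

# Ask a question directly over raw text, much like the Language Studio demo.
options = qna.AnswersFromTextOptions(
    question="What is the Oculus?",
    text_documents=[
        "Paste the sample passage about the Space Needle (or any text of your own) here.",
    ],
)
result = client.get_answers_from_text(options)

for candidate in result.answers:
    # Each candidate carries the answer text and a confidence score, as seen in the portal.
    print(f"Answer: {candidate.answer}")
    print(f"Confidence: {candidate.confidence}")
```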
So that's one way of doing it: you can answer questions over a passage of text. Another feature is custom question answering. This is when you have specific question-answer pairs that you provide and use to, in a sense, train your model. I have created a project called custom question answering and added a couple of sources here. How do you add a source? Very simple: you go here and add one via URL. If you have an FAQ page on your website, you can feed that URL in here and it will automatically be parsed into question-answer pairs. You can upload files. You can also add chit-chat. Chit-chat is a very interesting feature: sometimes, especially with conversational agents and chatbots, you might want to add a little personality to your bot and encourage it to answer small talk questions as well.

If you go that route, you can choose chit-chat and then pick one of five personalities, depending on the personality you would like to give your bot. If you choose professional, the application takes on a professional tone when answering those small talk questions; likewise friendly, witty, caring, and enthusiastic. This is very interesting because of what you get out of the box. I've created a friendly chit-chat source here, and out of the box, what I get is hundreds of question-answer pairs that cover all of this small-talk-related conversation. So if the user says something like 'I'm exhausted' or 'I wish I could nap', you already have ready-made answers for that.

Along the same lines, you can create your own sources and add custom question-answer pairs for your specific application or domain. For example, the one I've added here is the one we saw earlier in the episode. If I narrow this down, we can focus on it. You can add new question-answer pairs: you click 'add question answer', type in your question, type in your answer, and hit 'save'. For a specific question, you can add as many alternate phrasings as you want. You cater to the different utterances your users may use, and you train the model by saying that whichever of these utterances it receives, this is the answer it needs to provide. You can test within the portal itself and see the results for yourself. You can even enable rich text if you want to include audio, video, or images as part of your answers. You can also add follow-up prompts, so it isn't a single-shot question and answer but a dialogue flow; with multiple turns you can create some pretty powerful scenarios. You can go in and try all of that, and then, if you'd like to take this a step further and really start working in code, one of the things you can do is dive into the documentation from Microsoft. You can go back and take a look at the documentation; there are samples on GitHub, and there's SDK support, as we discussed. So you can try all of these out and see the specific question answering endpoints you might want to call, the API details, and so on.
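For example, once a custom question answering project like the one in this demo is deployed, querying it from code might look roughly like the sketch below; the project name and deployment name are assumptions to replace with your own, and the question is one of the episode examples from earlier.

```python
# pip install azure-ai-language-questionanswering
from azure.core.credentials import AzureKeyCredential
from azure.ai.language.questionanswering import QuestionAnsweringClient

# Placeholders for your own resource, project, and deployment slot.
endpoint = "https://<your-resource-name>.cognitiveservices.azure.com"
key = "<your-key>"
project_name = "custom-question-answering"  # hypothetical project name from the demo
deployment_name = "production"              # assumed deployment slot name

client = QuestionAnsweringClient(endpoint, AzureKeyCredential(key))

# The service matches the question against the knowledge base, including the
# alternate phrasings you added, and returns the best candidate answers.
response = client.get_answers(
    question="When is the next episode?",
    project_name=project_name,
    deployment_name=deployment_name,
)

for candidate in response.answers:
    print(f"({candidate.confidence:.2f}) {candidate.answer}")
    print(f"Source: {candidate.source}")
```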

Don't be afraid to ask a question. As Einstein said, the important thing is to never stop questioning. Thanks to the question answering feature of Azure Cognitive Service for Language, we can now build apps that entertain questions from users in natural language. Users are no longer restricted to choosing from a set of options. When they ask, they get responses from an app they can trust, anytime, anywhere.

True freedom is the right to question.