
Intro to Developing Chatbots In Azure


The growth of Artificial Intelligence might be scary for some people. There is the fear that soon robots with attitude will be taunting them to “Bite my shiny metal ass!” That fear, however, is misplaced. These are not the droids we are looking for. The reality is that they can be easily misled by the most elementary of tricks. Last month I analyzed Microsoft Zo and ventured to discover whether she could pass for a real girl. While she was shallow, I expect that some of you here will want to build your own bots.

“I’ll make my own chatbot, but with blackjack and h..!”

You can start for free with an Azure account, and there are three bot frameworks to choose from. The first option is a pre-built AI, but you can also choose to code a custom unit or train a conversational AI.

Azure AI Homepage

The Pre-Built bots are nearly complete machines that can be inserted into a website or application with only a few lines of code. You don’t have the option of training or engineering their behavior.

Your next option is the Conversational AI. It is a complete algorithm with full capacity to interface with Facebook, Slack, Cortana, and other services. It is still up to you to train the bot.

Finally there is the Custom AI. With the custom service you can develop models on your desktop, test them in the cloud, and execute them on a variety of machines. You have the option of running the bots on deep learning virtual machines, Spark Clusters, and Azure Batch AI services.

Nevertheless, all of these architectures share a common core: they are all built around Natural Language Processing (NLP). This is of course very challenging, because human language is a messy jumble of meanings.

People speak in local dialects: “Almost died first year I come to school and et them pecans.”

Words can have double meanings, or homonyms: “You know you’re right, now take a right turn.”

People use figures of speech: “I could do this forever.”

Luckily for the bot engineers, you have been providing them with bountiful data for decades in the form of messaging systems and social networks. Engineers have used this immense data as a foundation to train machines on converting human language into machine-readable language. The key elements of a language are:

Entities are nouns and can represent people, places, and things. The widespread availability of search engines makes this analysis easy.

Relations are used to connect nouns. They can be descriptions like “small things.”

Concepts build on the former. An example would be “He excelled at sports.”

Sentiment specifies how positive or negative the conversation is. Mastery of sentiment is an essential feature of a customer service chatbot.

Emotions are a specialized form of concepts. They are very minute elements of language and the most challenging form of analysis.

Keywords are a critical element of NLP. Keyword analysis is performed by indexing and searching for specific words and phrases.

Categories are the culmination of the analysis. Their job is to classify the type of conversation. This is very relevant for applications where you need to recommend content or organize messages.
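To make the elements above concrete, here is a toy sketch in Python of the two simplest ones, keywords and sentiment. This is not a real NLP library: the word lists and the scoring rule are invented for illustration, and production systems use trained models rather than lookups.

```python
# Toy illustration of keyword indexing and sentiment scoring.
# The cue-word sets below are hypothetical, not from any real service.
POSITIVE = {"great", "excellent", "love", "happy"}
NEGATIVE = {"terrible", "hate", "angry", "broken"}

def analyze(text):
    """Return a naive keyword list and sentiment label for a message."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    keywords = [w for w in words if len(w) > 4]  # crude "index the long words" rule
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {"keywords": keywords, "sentiment": sentiment}

result = analyze("I love this excellent service")
```

A real sentiment analyzer weighs context and negation; this sketch only shows why the problem starts as a counting and indexing exercise.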

Microsoft uses its own Language Understanding Intelligent Service (LUIS) to assist developers in extracting the meaning of a chat. The application is designed to extract many of the previously mentioned inputs:

Intents are the end goal of any conversation. They could also be viewed as commands like “Book flight”.

Dialog is the evolved form of intents. It could be a sentence like “Book a flight to Sydney”.

Entities are the same thing mentioned earlier. That would be Sydney from the last sentence.
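The intent–entity split can be sketched in Python. The field names below follow the shape of a LUIS JSON response; the values, and the `extract` helper itself, are illustrative assumptions rather than output from the real service.

```python
# Hypothetical LUIS-style result for "Book a flight to Sydney".
luis_result = {
    "query": "Book a flight to Sydney",
    "topScoringIntent": {"intent": "BookFlight", "score": 0.97},
    "entities": [{"entity": "sydney", "type": "Location"}],
}

def extract(result):
    """Pull the intent and its entities out of a LUIS-style response."""
    intent = result["topScoringIntent"]["intent"]
    entities = [e["entity"] for e in result["entities"]]
    return intent, entities

intent, entities = extract(luis_result)
```

Here the dialog is the full sentence, the intent is the command “BookFlight”, and the entity is “sydney”, exactly the three layers described above.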

You are now ready to build your first bot.

You will first need a LUIS account to get started; this can be done at the LUIS homepage. The system constructs the protocols for setting the queries for your bot’s conversation. To begin, you will have to launch your application bot within LUIS. After you have validated your identity, you will have the option of specifying the types of questions the bot supports. The built-in QnA tool is designed to return valid questions and is intended for quick development. You also have the option of training the bot with the Bing Search API and the Text Analytics API, but you should expect a longer development time.

Bing Search is intended to convert speech to text and is optimized for voice-triggered mobile applications.

The Text Analytics API is much more rigorous and is intended for sentiment analysis, as mentioned earlier.

This brings us to the critically important confidence score of speech recognition. This is the crux of chatbot training. All speech recognition is fundamentally a data-mining exercise, and if you don’t believe me, then try reading this in another language.

{
  "query": "Hey there pal",
  "topScoringIntent": {
    "intent": "Greeting",
    "score": 0.9887
  }
}
This is the root dialog. LUIS interprets the intention of each query and returns its confidence in that intention. The higher the confidence, the better the comprehension of the intention. Each dialog can also be viewed as independent, because its intention invokes a specific function. The response is then passed on to the next function and the process repeats. This pattern is known as a waterfall dialog, and it allows the developer to divide out the specific functions of the bot.
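A minimal sketch of this routing step, in Python: pick the handler for the top-scoring intent, but fall back when the confidence is below a threshold. The handler names and the 0.7 threshold are hypothetical; a real Bot Framework bot would use the SDK’s dialog machinery instead of a plain dictionary.

```python
# Route a LUIS-style response to an intent handler, with a confidence floor.
def handle_greeting(query):
    return "Hello! How can I help?"

def handle_fallback(query):
    return "Sorry, I didn't understand that."

HANDLERS = {"Greeting": handle_greeting}  # one function per intention
CONFIDENCE_THRESHOLD = 0.7                # illustrative cutoff, tune per bot

def route(luis_response):
    top = luis_response["topScoringIntent"]
    if top["score"] < CONFIDENCE_THRESHOLD:
        return handle_fallback(luis_response["query"])
    handler = HANDLERS.get(top["intent"], handle_fallback)
    return handler(luis_response["query"])

reply = route({"query": "Hey there pal",
               "topScoringIntent": {"intent": "Greeting", "score": 0.9887}})
```

In a waterfall dialog each handler would return control to the next step rather than a final string, but the dispatch-on-intent idea is the same.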

Bot architecture

The bot itself is constructed of an array of functions, with each function executing a specific task.

Chatbot Brain

In this demo the Bot functions as the UI for the application, and it can accept inputs from email, Skype, and other services. The acceptable inputs can be set in the chatbot’s configuration.

The Bot Brain is controlled by an array of functions with each function tasked with processing a specific intention.

In a typical bot the tasks are Ask Who, Learn More, and Answer Questions. Each task functions independently and can be debugged separately. These tasks can also be removed or upgraded as bot technology evolves.
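The “independent, replaceable tasks” idea can be sketched as a simple registry in Python. The task names come from this article; the stub implementations and return strings are invented for illustration, standing in for real calls to the Bing services.

```python
# The bot brain as a registry of independent task functions: each entry can
# be tested, removed, or upgraded without touching the others.
def ask_who(subject):
    # Stand-in for the Bing Web Search + Bing Image Search calls.
    return f"Searching the web and images for {subject}..."

def learn_more(subject):
    # Stand-in for the Bing Custom Search call.
    return f"Running a custom search on {subject}..."

TASKS = {"AskWho": ask_who, "LearnMore": learn_more}

# Upgrading a task as the technology evolves is just rebinding its entry:
TASKS["LearnMore"] = lambda s: f"Custom search v2 on {s}..."
```

Because each task lives behind its own key, a broken task can be debugged or swapped out without redeploying the rest of the brain.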

For example, the Ask Who task is built on the Bing Web Search and Bing Image Search algorithms.

The Learn More task calls the Bing Custom Search algorithm.

All of these tasks exist separately within the Microsoft infrastructure.

The code that executes these functions is both fascinating and complicated, but it is beyond the scope of this story; I will explore it further in the next chapter. Luckily for you, coding is only necessary for custom chatbots. You already have the conceptual framework necessary to train a bot and insert it into your website. It could also be said that this investigation unlocked the mental depth of bots: Bing Search.

This invites the question.