In this blog, I'll explain the terminology used in Dialogflow.
Dialogflow is a platform where one can build natural-language-based apps for multiple platforms such as Google Assistant, Slack, Alexa, Messenger, Skype, Telegram and many more.
It mainly requires the designer/developer to create intents and entities that capture the main keywords from the user's queries.
User:- Okay Google, What is the temperature in Mumbai?
Dialogflow extracts: Mumbai (Entity: City).
User:- Hey Google, Order a Margherita Pizza for me.
Dialogflow extracts: Pizza (Entity: Food), Margherita (Entity: Flavour).
Now, with the values above, it is up to the designer to send the answer to the user, for example by calling a temperature API with the extracted city.
Notably, Dialogflow handles the machine-learning training internally, and the platform is owned by Google, so the accuracy of its intent matching is at a remarkable level.
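The extracted intent and entity values reach the developer's backend as structured JSON. Here is a minimal sketch of reading them out, following the shape of the Dialogflow v2 webhook request; the intent name and parameter keys below are hypothetical:

```python
# A hypothetical Dialogflow v2 webhook request for the pizza example.
# Field names follow the v2 webhook request shape; the intent name
# and parameter keys are assumptions for illustration.
webhook_request = {
    "queryResult": {
        "queryText": "Order a Margherita Pizza for me",
        "intent": {"displayName": "order.pizza"},
        "parameters": {"food": "pizza", "flavour": "Margherita"},
    }
}

# The backend reads the matched intent and its entity values (parameters).
intent = webhook_request["queryResult"]["intent"]["displayName"]
params = webhook_request["queryResult"]["parameters"]
print(intent)             # order.pizza
print(params["flavour"])  # Margherita
```

With the intent name and parameters in hand, the backend can branch to the right business logic, such as a weather lookup or an order API.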
Here are a few terms used in Dialogflow.
An Agent handles conversations with end-users. It is a natural language understanding module that understands the nuances of human language. You design and build a Dialogflow agent to handle the types of conversations required for your system. One agent handles one Action. It consists of intents, entities, fulfillment and integrations with other platforms. Read more about Agent
Prebuilt agents are already-trained agents that a developer can import. They are general-purpose agents used across many Actions, much like libraries: for example, agents for Navigation, News, Time, Translate, Weather, etc.
By the dictionary meaning, an intent is an intention or purpose. In our case, it is the intention of the user: the bot needs to understand what the user wants from the chat.
On any chat platform, the intent is the intention of a user. The AI designer trains the bot to understand various user intentions based on a sample training data set. Then, when a user talks to the bot, the AI performs Natural Language Processing (NLP) and tries to identify the user's intention based on the training it has received.
For example, if a user wants to order a pizza, the intent name can be PIZZA-ORDER. Read more about Intent
Training Phrases are used to train the intent. Every intent identification is backed by a confidence score, meaning the AI has guessed the user's intent with X% confidence. As AI designers, it is our task to set a base-level confidence score called the Threshold Confidence. Supposing the Threshold Confidence is 80%, all NLP guesses made by our AI with confidence above 80% will automatically be treated as correct, and a response will be generated for the user on that basis.
For the given example, the training phrases can be:
Hey Google, Order a Pizza for me
Hey Google, I want to order a pizza
I'd like to have a Pizza
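The threshold check described above is simple to sketch; the 80% figure and the function name here are illustrative, not part of Dialogflow's API:

```python
THRESHOLD_CONFIDENCE = 0.80  # illustrative base-level confidence

def accept_guess(confidence):
    """Accept an NLP intent guess only if its confidence clears the threshold."""
    return confidence >= THRESHOLD_CONFIDENCE

print(accept_guess(0.92))  # True  -> respond using this intent
print(accept_guess(0.55))  # False -> fall back, e.g. ask the user to rephrase
```

In practice Dialogflow exposes this as the agent's ML classification threshold setting; guesses below it fall through to a fallback intent.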
Contexts are a tool that allows Dialogflow developers to build complex, branching conversations that feel natural and real.
Here’s an example of a context. Read more about context
User: “Will it rain in Mumbai today?”
Agent: “No, the forecast is for sunshine.”
User: “How about Delhi?”
Agent: “Delhi is expecting rain, so bring an umbrella!”
While the follow-up, “How about Delhi?”, doesn’t make sense as a standalone question, the agent knows the contextual inquiry is still about rain.
Dialogflow uses contexts to manage conversation state, flow and branching. You can use contexts to keep track of a conversation’s state, influence what intents are matched and direct the conversation based on a user’s previous responses. Contexts can also contain the values of entities and parameters, based on what the user has said previously.
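As a sketch of how a webhook could set a context so the "How about Delhi?" follow-up resolves: the field names below follow the Dialogflow v2 webhook response shape, while the context name, session path and parameter values are hypothetical.

```python
def weather_response(session, text, city):
    """Build a Dialogflow v2 webhook response that also sets an output
    context remembering the weather topic, so a follow-up like
    'How about Delhi?' can be matched against it. The context name
    'weather-followup' is hypothetical."""
    return {
        "fulfillmentText": text,
        "outputContexts": [
            {
                # Context names are scoped to the conversation's session.
                "name": f"{session}/contexts/weather-followup",
                "lifespanCount": 2,  # stays active for the next 2 turns
                "parameters": {"geo-city": city},
            }
        ],
    }

resp = weather_response(
    "projects/my-project/agent/sessions/abc123",  # hypothetical session path
    "No, the forecast is for sunshine.",
    "Mumbai",
)
print(resp["outputContexts"][0]["name"])
```

The lifespan count controls how many conversational turns the context survives, which is how branching conversations eventually "forget" old state.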
The basic response type is a text response. Other types of responses are available (image, audio, synthesized speech, and so on), some of which are platform-specific. If you define more than one response variation for an intent, your agent will select a response at random. These may provide the end-user with answers, ask the end-user for more information, or terminate the conversation.
Response types available in Google Assistant: Basic Card, Suggestions, Carousel Card, List, Browse Carousel Card, Media Content, Table, etc.
In every message the user sends to our bot, there is some valuable information, such as location, time, or food to order. This valuable information is called entities. E.g., the user says: "I want to watch some comedy movie tonight." In this case, the user's intent is to "watch a movie", whereas the valuable information, viz. the entities, are Content: Movie, Type: Comedy, Timing: Tonight.
Read more about Entity
System Entities are predefined entities. A developer does not need to train the intent on such values. For example: "What is the capital of India?"
Here, India is part of the Country entity.
A few system entities: Date, Time, Number, Country, Address, Date-time, Percentage, Age, Flight-number, Geo-capital, Unit-length, Geo-city, Unit-currency, URL, Language, Colour, Person, Temperature, Email, etc.
Custom Entities are user-defined entities. It is not possible to have a predefined entity for each and every element, so a developer can define their own. For example, an entity SPORT could include Cricket, Football, Volleyball, Hockey, Tennis, etc.
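A custom entity like SPORT boils down to a map of canonical values to synonyms. The sketch below loosely mirrors that idea; the structure and the synonym lists are illustrative, not Dialogflow's exact wire format:

```python
# A user-defined entity as canonical values plus synonyms.
# The shape and synonym lists are illustrative.
sport_entity = {
    "displayName": "sport",
    "entities": [
        {"value": "cricket", "synonyms": ["cricket"]},
        {"value": "football", "synonyms": ["football", "soccer"]},
        {"value": "volleyball", "synonyms": ["volleyball"]},
        {"value": "hockey", "synonyms": ["hockey", "field hockey"]},
        {"value": "tennis", "synonyms": ["tennis", "lawn tennis"]},
    ],
}

def match_sport(word):
    """Resolve a user's word to the canonical entity value via its synonyms."""
    for entry in sport_entity["entities"]:
        if word.lower() in entry["synonyms"]:
            return entry["value"]
    return None

print(match_sport("soccer"))  # football
```

Synonyms are what let the agent map "soccer" in a user query back to the canonical value "football" when filling a parameter.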
Replies from the Responses tab are static, whereas Fulfillment provides dynamic responses. Fulfillment is code that's deployed as a webhook and lets your Dialogflow agent call business logic on an intent-by-intent basis. During a conversation, fulfillment allows you to use the information extracted by Dialogflow's natural-language processing to generate dynamic responses or trigger actions on the backend. A developer can use fulfillment in two ways. Read more about Fulfillment
A webhook is used when the code is large or uses external APIs.
The Inline Editor is used when the code is small; it is deployed on Google Cloud Functions.
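Either way, the core of a fulfillment is a dispatch on the matched intent. A minimal sketch, assuming hypothetical intent names and response texts; a real deployment would wrap this in an HTTP endpoint:

```python
def handle_webhook(request_json):
    """Route a Dialogflow v2 webhook request to per-intent business logic
    and return a dynamic response. Intent names are hypothetical."""
    query = request_json["queryResult"]
    intent = query["intent"]["displayName"]
    params = query.get("parameters", {})

    if intent == "order.pizza":
        flavour = params.get("flavour", "plain")
        text = f"Your {flavour} pizza is on its way!"
    elif intent == "get.temperature":
        # In a real agent this branch would call an external weather API.
        text = f"Fetching the temperature for {params.get('geo-city', 'your city')}..."
    else:
        text = "Sorry, I didn't get that."

    return {"fulfillmentText": text}

sample = {
    "queryResult": {
        "intent": {"displayName": "order.pizza"},
        "parameters": {"flavour": "Margherita"},
    }
}
print(handle_webhook(sample)["fulfillmentText"])  # Your Margherita pizza is on its way!
```

The same handler shape works for both options: as an exported Cloud Function in the Inline Editor, or behind any HTTPS endpoint as an external webhook.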
A developer can integrate Actions with many platforms such as Telegram, Cortana, Skype, Web, Bixby, phone calls, etc. Read more about Integration
Thank you for reading this blog. I hope you liked it.
I'm open to suggestions!