
david wyatt


Chat GPT Powered Teams Private Virtual Agent

chatbot demo
Demo based on Amazon Delivery Info here

Full Power Virtual Agents is great: the latest updates to its NLP and automatic topic creation make it incredibly powerful. But unlike Power Automate and Power Apps, it's locked behind expensive license and usage costs.

So if we want to generate multiple knowledge article chatbots, we really need a different option.

That leaves two options: Power Apps and Power Virtual Agents for Teams. Surprisingly, Power Apps has more power to create intelligent chatbots than PVA for Teams, as the Teams Dataverse environments don't allow AI Builder or custom connectors. Fortunately, there is a workaround, and now an easy way to leverage ChatGPT functionality.

chose pva

There are a couple of caveats:

  • The solution isn't great from a security point of view
  • Although free at the moment, there is a usage cost
  • It's an Azure GPT model, not OpenAI (but there is little difference)

The solution has the following key sections:

  1. Front End
  2. GPT integration
  3. Data Modelling
  4. End to End Solution

1. Front End

As stated, we are going with Teams PVA, but this could easily be swapped out for a Canvas Power App (if you wanted to embed it into a SharePoint site, etc.).

canvas app chat gpt

2. GPT integration

There are lots of ways to add GPT integrations (premium connectors, custom connectors, the HTTP connector), but the easiest is the new Create text with GPT AI Builder connector.

create text with GPT

We can call the connector in a flow and pass the data back to the chatbot (or the Power App if you go that way).
I've written a full blog about how to use it and how cool it is here.
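As a rough sketch, the prompt passed to the Create text with GPT action can be built with a simple concat expression that combines the filtered knowledge article text with the user's question. The variable names varContext and varQuestion are just placeholders for whatever you use in your own flow:

```
concat(
  'Answer the question using only the context provided. ',
  'Context: ', variables('varContext'),
  ' Question: ', variables('varQuestion')
)
```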

3. Data Modelling

The only issue with the connector (and ChatGPT for that matter) is the token limit (effectively a character limit). There is no documentation yet on the connector's limit (the OpenAI limit is 2048 tokens), but it means we need to be creative when we are analysing large documents of data.

This is where a little work is required: we need to transfer the knowledge article data into a database/list (SharePoint works great). It's a pain, as most business guides etc. are in PDFs or site pages, but as an overall strategy it's better in the long run for data accuracy and maintenance.

Now we have our data indexed, we can filter the list to get the context; the challenge is knowing what to query for. Luckily, we already have the answer: the GPT model can summarise the key words from the question (so we use the question as the context). We then filter the text to see if it contains the key words. During testing I found that the sweet spot was 3 words, but ignoring the third (as the third word was sometimes less relevant than required).

key words
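A minimal sketch of what the Get items filter query can look like, assuming the article text lives in a column called Content (an example name) and using 'delivery' and 'tracking' as stand-ins for the key words the GPT step returns:

```
substringof('delivery', Content) and substringof('tracking', Content)
```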

We then loop over all the filtered items and append them to a string variable.
I also had a fallback that would return items matching any of the key words: the flow first filters by both key words, and if none are returned it filters by either key word (see the sketch below).

question
Note: I found the 'Fail threshold and action' parameter had unexpected impacts, so I removed it.
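A sketch of that fallback, assuming the first Get items action keeps its default name and reusing the example column and key words from above: a condition checks whether the AND filter returned anything, and if not the list is queried again with an OR filter.

```
// Condition expression: did the first (AND) filter return any items?
empty(body('Get_items')?['value'])

// If that is true (nothing came back), run Get items again with an OR filter query:
substringof('delivery', Content) or substringof('tracking', Content)
```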

4. End to End Solution

Our end to end solution looks like this:

chatbot process

But we now have one more issue to handle, and this is where it gets a little messy (from a security point of view): how do we use AI Builder in Dataverse for Teams when it's not allowed?

The trick is that we need to pass our request to a flow in a different environment; to do this we need the HTTP actions:

  • HTTP - to call and receive the data from the different environment flow
  • When a HTTP request is received - to trigger the different environment flow

Security issue one is that the When a HTTP request is received endpoint is open, so anyone could call it.

There are a couple of ways to secure it, but to keep it simple I went with an API key in the header.

http request

In this case the API key is called 'secretkey' and its value is 'secretValueToRunThisFlow'.
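In the HTTP action the key simply goes into the headers alongside the request body. Roughly (the body shape and property names are just an example of what you might pass across):

```
Headers:
{
  "secretkey": "secretValueToRunThisFlow",
  "Content-Type": "application/json"
}

Body (example shape):
{
  "question": "Where is my parcel?",
  "context": "<filtered knowledge article text>"
}
```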

In the When a HTTP request is received trigger we then have a trigger condition to check the key:

trigger conditions
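The trigger condition itself is just an equals check against the incoming header, matching the name and value set in the calling flow:

```
@equals(triggerOutputs()?['headers']?['secretkey'], 'secretValueToRunThisFlow')
```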

The second security issue is the secretkey value: it's not stored in an Azure Key Vault, so anyone with access to the flows could see it in plain text. The run logs would also show it unless we turn on secure inputs for both.

chatbot process 2

If security isn't happy, we can always fall back to replacing PVA with a Canvas App (no need for the HTTP hops anymore).

PVA Flow

GPT Flow

PVA

pva
The PVA topic should be the default or be called by the greeting.

And there you have it, a fully-fledged chatbot in Teams, for free(ish). What's cool is that you could store all your knowledge data in one list with an additional chatbot field. Then different chatbots could call the same parent flow, which just filters the data for them.
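If you go down the single-list route, the parent flow only needs the chatbot name added to its filter. Assuming a text column called Chatbot and a bot named DeliveryBot (both just example names), the Get items filter query could look something like:

```
Chatbot eq 'DeliveryBot' and (substringof('delivery', Content) or substringof('tracking', Content))
```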


Additional Requirements
As I didn't want to go into too much detail, I didn't include exception handling for:

  • No topics found
  • Escalation for incorrect answer
