Build a Serverless Discord Bot with OpenAI GPT-3 in Minutes using Template Functions in Lolo

Leverage the low-code side of Lolo Code to build a Discord bot that can receive and respond to messages with OpenAI’s GPT-3 in a few minutes.


This article is a response to a user who asked us to show how to quickly build a serverless Discord bot that can handle and create messages using OpenAI’s GPT-3. We’ll be using a library function we’ve set up, so there will be minimal code involved. You’ll still be able to customize intents and partials as you wish by configuring the Discord Socket function in Lolo.

See the Lolo workflow below.

[Screenshot: the Lolo workflow]

The whole process takes about five minutes. Let’s start.

Create your Discord Bot

Get started by going to https://discord.com/developers/applications and creating a new application. You’ll need to have a Discord server before doing this, so create one if you don’t have one yet.


Go to the Bot tab and click 'Create a Bot'. Once your bot has been created, you’ll need to set some Privileged Gateway Intents by scrolling down.


After you’ve saved, click Reset Token and store your token somewhere safe. We’ll need it in a bit.

Now navigate to OAuth2 and then URL Generator. Select Bot under Scopes, then tick Send Messages and Read Messages/View Channels under Bot Permissions.


Copy the URL that has been generated for you at the bottom. Paste it into your browser. This is how you’ll authorize your application.
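For reference, the generated URL should look roughly like the example below, where the client_id is your application’s ID and 3072 is the permission bitfield for View Channels plus Send Messages (an illustrative example, not your exact URL):

```
https://discord.com/api/oauth2/authorize?client_id=YOUR_CLIENT_ID&permissions=3072&scope=bot
```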


Great. We’re done here.

Create an Application & add the Discord Socket

Go to Lolo Code and create a new application. You can create a free account on the same page if you don’t have an account.

You can name the application whatever you want.

Once you have an empty graph in front of you, go to the Library Functions menu on the left. Look for Discord Socket and add it once you find it.


You open functions by double-clicking them. The Discord Socket needs to be configured with the token we got earlier.

We also need to set Operation → Messages.

To be able to read DMs sent to the bot, we need to set some Intents and Partials as well. Set DirectMessages, GuildMessages and MessageContent as Intents. You can set Guilds as well. Make sure you also set Channel under Partials. See our config below.

[Screenshot: the Discord Socket configuration]

What you set here dictates the messages your socket will receive from Discord. You can read up on Intents here and Partials here.
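If you’re curious what this corresponds to outside of Lolo, the same intents and partials are passed straight to the Discord client. A minimal sketch in plain discord.js (assuming v14, where the enums are named GatewayIntentBits and Partials) would look something like this:

```javascript
// Sketch only — the Lolo Discord Socket sets this up for you.
const { Client, GatewayIntentBits, Partials } = require("discord.js");

const client = new Client({
  intents: [
    GatewayIntentBits.Guilds,
    GatewayIntentBits.GuildMessages,
    GatewayIntentBits.DirectMessages,
    GatewayIntentBits.MessageContent,
  ],
  // Channel partials are needed to receive DMs, since DM channels
  // may not be cached when the message event arrives.
  partials: [Partials.Channel],
});

client.on("messageCreate", (message) => console.log(message.content));

client.login(process.env.DISCORD_BOT_TOKEN);
```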

Process Incoming Messages from Discord

Now that we’ve set up the Discord Socket, we’ll receive messages depending on the intents we set. With this configuration, we’ll receive messages from any channel.

Depending on which messages you’d like to respond to, you may want to start by simply logging the data that comes with each message in a new function.

So, create a new function and paste this code into it.

```javascript
exports.handler = async (ev, ctx) => {
  const { log } = ctx;

  // you'll have access to the client as well as the interaction in the event object
  log.info("incoming event data: ", JSON.stringify(ev.interaction));
};
```

This code will log the interaction data to the Logs in your Lolo application.

You can connect the Discord Socket to the new function now, then go Save → Configure → Advanced Settings → set Node v16 → Update. This should deploy your application with Node version 16.

See the clip below on how we do this.

[Clip: deploying the application with Node v16]

When you first deploy with new dependencies, it can take up to a minute before your Logs start coming in.

Once it has been deployed, you can send your bot a private message in Discord and watch the event come in under Logs in Lolo.


You may want to look into this event data to decide what to filter on. I’ll filter for messages coming from this channel ID and ignore any messages made by the bot itself, i.e. I only want messages that come in as a direct DM and were not written by the bot.

So open up the new function again, remove the previous code and paste in this code. You’ll have to tweak two values in it, setting your own channelId and authorId.

```javascript
exports.handler = async (ev, ctx) => {
  const { route, log } = ctx;

  // you'll have access to the client as well as the interaction in the event object
  log.info("incoming event data: ", JSON.stringify(ev.interaction));

  // serialize the interaction to get plain data fields for channel and author
  const interaction = JSON.parse(JSON.stringify(ev.interaction));

  // filter the message - only pass on DMs from this channel
  // that were not written by the bot itself
  // REMEMBER TO CHANGE THE IDS
  if (interaction.channelId === 'CHANNELID HERE' && interaction.authorId !== 'BOT AUTHOR ID HERE') {
    ev.message = interaction.content;
    route(ev, 'out');
  }
};
```

I had already logged a message from the bot to get that interaction.authorId, but you may want to start by filtering only on interaction.channelId plus some other identifier. You also have access to the message text itself in interaction.content, which you can use instead.
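For example, a variant that keys off a (hypothetical) !ask prefix in the message text instead of the bot’s author ID could look like this sketch:

```javascript
exports.handler = async (ev, ctx) => {
  const { route, log } = ctx;

  const interaction = JSON.parse(JSON.stringify(ev.interaction));
  log.info("incoming event data: ", JSON.stringify(interaction));

  // only pass on DMs from this channel that start with the command prefix;
  // the bot's own replies won't match, so no author ID is needed
  // REMEMBER TO CHANGE THE CHANNEL ID
  const content = interaction.content || "";
  if (interaction.channelId === 'CHANNELID HERE' && content.startsWith("!ask")) {
    // strip the prefix before the text is sent on to OpenAI
    ev.message = content.slice("!ask".length).trim();
    route(ev, 'out');
  }
};
```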

Let’s make an API call to OpenAI now.

Make an OpenAI API Call with the Message

If you look at the code above, we set ev.message to the text content when a message passes that if statement. We’ll use this in the next function.

Open up the Library Functions menu again and look for Lolo/OpenAI Completions. Once you find it, click it to add it to the graph, then open it up to configure it.

We’ll need an OpenAI API key for this, so go grab one from your OpenAI account. If you don’t have an account yet, create one and then generate a new key.

Set the prompt to {event.message} and the model to text-davinci-003. Also set a couple of additional options: temperature to 0.3 and max_tokens to 1500. Tick Store Prompt History if you want the feeling of a conversation with your bot; this is optional.


You may tweak these values as you like; the function is simply making an API call. However, don’t set max_tokens too high or you may end up with 400 errors.
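For reference, the library function is essentially making this Completions call for you. A rough equivalent with the openai npm package (a sketch assuming the v3 Node client, whose response shape matches the ev.response.data used below) would be:

```javascript
const { Configuration, OpenAIApi } = require("openai");

exports.handler = async (ev, ctx) => {
  const { route } = ctx;

  // in practice the key would come from a secret or environment variable
  const openai = new OpenAIApi(
    new Configuration({ apiKey: process.env.OPENAI_API_KEY })
  );

  // same parameters as the Lolo/OpenAI Completions configuration above
  const response = await openai.createCompletion({
    model: "text-davinci-003",
    prompt: ev.message,
    temperature: 0.3,
    max_tokens: 1500,
  });

  // mirror the library function's output so the next function can
  // read ev.response.data.choices[0].text
  ev.response = { data: response.data };
  route(ev, "out");
};
```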

We’re done here, so we can now set up the response we’ll send back to Discord in another new function.

Reply to the Message with the OpenAI Response

Go back to your graph and create another new function.

Paste this code into its handler.

```javascript
exports.handler = async (ev) => {
  // OpenAI response text
  const aiResponse = ev.response.data.choices[0].text;

  // respond to the Discord message
  ev.interaction.reply(aiResponse);
};
```

Rename it Reply and remove the out port.
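One thing worth noting: Discord caps a single message at 2000 characters, and with max_tokens set to 1500 a completion can occasionally exceed that. A slightly more defensive version of the reply handler (a sketch, not required) could trim and guard the response:

```javascript
exports.handler = async (ev) => {
  // OpenAI response text
  const aiResponse = (ev.response.data.choices[0].text || "").trim();

  // Discord rejects empty messages and messages over 2000 characters
  if (aiResponse.length > 0) {
    await ev.interaction.reply(aiResponse.slice(0, 2000));
  }
};
```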


Remember to connect all your functions with the correct ports. See the finished workflow below.

[Screenshot: the finished workflow]

Save your application. Now go to Configure instead of Run and set Node version 16 under Advanced Settings. After this, click Update. This will run your application with Node version 16.


You shouldn’t have to pick the Node version each time; we’re fixing this in the new IDE. For now, unfortunately, you’ll have to set it every time you re-deploy.

Wait about a minute to see your Logs come in. It should log ‘bot live’ when it is running.

Test your GPT-3 AI Discord Bot

Now go back to Discord and DM your bot a message to test it out.

The result should look like the conversation below.

[Screenshot: a sample conversation with the bot]

Try whatever you like with it. We haven’t tested everything yet, and it goes without saying that you can do more advanced things with this setup.

Join us on Discord to talk with others or get help from the team.

❤ Lolo
