Today, I'll guide you through integrating an open source LLM into a WordPress website using Hexabot—a powerful, open source AI conversational builder that lets you create chatbots or AI agents. If you love open source projects, consider giving Hexabot a star on GitHub and exploring our documentation. Let’s get started!
Step 1: Install Hexabot CLI
First, you'll need to install the Hexabot CLI globally. You can do this easily using npm:
npm install -g hexabot-cli
Next, create a new Hexabot project by running the following command:
hexabot create wordpress-chatbot
This will generate a folder containing your chatbot project. Navigate to the new project directory:
cd wordpress-chatbot
Install the necessary dependencies by running:
npm install
Before proceeding, make sure you have Node.js and Docker installed: the Hexabot CLI runs on Node.js, and Docker is required to run the project's services.
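If you're not sure whether those prerequisites are already in place, a quick check from the terminal looks like this:
node --version
docker --version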
Step 2: Initialize Hexabot
After installing the dependencies, initialize the project environment:
hexabot init
This command generates the environment configuration file docker/.env. Once that's done, you can launch your project in development mode by running:
hexabot dev
You will also need Ollama, an open source tool for running LLMs locally; Hexabot integrates with it out of the box, and it is what will run the AI model.
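Before moving on, you can confirm that the Ollama service came up with the rest of the stack; the filter below assumes the container is named ollama, which the later steps also rely on:
docker ps --filter "name=ollama"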
Step 3: Setting Up Ollama
Ollama is a key part of this setup: it runs the open source LLM that powers your chatbot. Once you’ve started the services, it’s time to configure it.
To pull the required LLM (for example, Llama 3.2), open a shell inside the Ollama container:
docker exec -ti ollama bash
Then pull the model:
ollama pull llama3.2
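Still inside the container, it's worth confirming the download and giving the model a quick smoke test before exiting (the prompt here is just an example):
ollama list
ollama run llama3.2 "Say hello in one sentence."
exit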
Once the model is pulled, you’re ready to configure the chatbot in the Hexabot admin panel.
Step 4: Accessing the Admin Panel
With the services up and running, open your browser and head to http://localhost:8080 to reach the Hexabot admin panel.
Navigate to the Visual Editor. Here, you can drag and drop blocks onto the canvas to build your chatbot's conversation flow. Add the Ollama block plugin to enable AI-generated responses, then configure its settings, such as defining a context like:
"You are an AI assistant working for Etudes, an architecture firm."
Make sure you have also pulled the appropriate model (llama3.2 in this example) as described in the previous step.
Step 5: Integrate Hexabot with Your WordPress Website
Now, let’s connect this chatbot to your WordPress website. First, navigate to your WordPress project and start your site using Docker Compose:
cd path/to/wordpress
docker-compose up
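If you don't already have a local WordPress project, a minimal docker-compose.yml along the lines of this sketch is enough for testing; the image tags, credentials, and port mapping here are assumptions you should adapt to your own setup:
services:
  wordpress:
    image: wordpress:latest
    ports:
      - "8000:80"   # serve the site on http://localhost:8000
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress
    depends_on:
      - db
  db:
    image: mysql:8.0
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
      MYSQL_ROOT_PASSWORD: rootpassword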
Access your WordPress site, then go to the Plugins section and search for the Hexabot plugin. Install and activate it.
Step 6: Configure the Chat Widget
Once the Hexabot plugin is activated, go to its settings page to configure the chat widget. You’ll need to set the API URL to:
http://localhost:4000
Ensure the correct token is set, and select the web channel to complete the configuration.
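Before saving, you can quickly check that the Hexabot API is actually reachable at that address (this assumes the API container exposes its default port, 4000, on your host):
curl -I http://localhost:4000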
Step 7: Customize the Widget
After adding the widget to your site, you can further customize it by changing its settings. For example, you can adjust the widget title, colors, and other elements to match the branding of your WordPress website. Refresh your website, and you should see the widget live and ready to engage with users.
Try interacting with your chatbot: say "Hello" and watch as Ollama powers the responses, providing the AI experience you configured earlier. You can further tweak the settings to provide a personalized touch.
Conclusion
Congratulations! You've successfully integrated an open source LLM with your WordPress website using Hexabot. This setup allows you to build a powerful and interactive chatbot, perfect for engaging users directly on your WordPress platform.
If you found this guide helpful, please consider starring our Hexabot GitHub repository and getting involved in our community. Hexabot is built by the community, for the community, and we would love for you to join our journey! You can also connect with us on Discord, and check out our YouTube channel for more tutorials and tips.
Thanks for reading, and happy coding!