Sunil Kumar Dash for Composio


I was tired of manual research and web design, so I built an AI bot for it πŸ€–βœ¨

TL;DR

I have been juggling many tasks lately, which leaves me little time for researching what’s new in AI and tech in general.

So, I built this AI bot to research the internet, refine the content, and render the information on a static website.

This is how it works overall:

  • Takes natural-language instructions
  • Searches the internet
  • Updates the existing boilerplate code to reflect changes on the website



Composio - Your friendly neighbourhood AI platform ✨

If you are building an AI (LLM)-powered application, you will eventually need to integrate external applications with AI models to automate workflows.

For instance, ChatGPT integrates web search to provide up-to-date information, DALL·E to generate images, and a code interpreter to execute code on the fly. The model routes each request to the appropriate tool based on the user's prompt.
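As a toy illustration (not ChatGPT's actual implementation), tool routing can be thought of as mapping a request to the right tool before answering; here a hypothetical keyword check stands in for the model's decision:

```python
# Toy sketch of tool routing: a keyword heuristic stands in for
# the LLM's decision about which tool a request needs.
TOOLS = {
    "web_search": lambda q: f"live results for {q!r}",
    "image_gen": lambda q: f"generated image for {q!r}",
    "code_interpreter": lambda q: f"executed: {q!r}",
}

def route(request: str) -> str:
    # The real router is the model itself; this heuristic is only illustrative.
    if "draw" in request or "image" in request:
        tool = "image_gen"
    elif "run" in request or "execute" in request:
        tool = "code_interpreter"
    else:
        tool = "web_search"
    return TOOLS[tool](request)
```

In practice, the model emits a structured tool call and the platform executes it; the dictionary dispatch above only mimics that shape.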

You can do the same using Composio, but with 100+ applications and specialized tools.


Please help us with a star. πŸ₯Ή

It would help us to create more articles like this πŸ’–

Star the Composio repository ⭐


How does it work?

The project has two parts.

  • Set up an npm project whose HTML and CSS files the AI tool auto-updates.
  • Configure CrewAI agents with Composio's web-search and file tools.

Workflow Overview

  • First, set up a project and run the npm server.
  • Configure Composio and integrate tools with the CrewAI agents.
  • Set up a Streamlit frontend to interact with the agents.
  • When instructions are passed, the agent crew springs into action. If required, it collects information from the web and updates the index.html file accordingly.

Let’s get started! πŸ”₯

First, set up your front-end environment. We are keeping it short and editing only a single file for brevity. Do comment if you'd like a fuller version with end-to-end coding capabilities.

Create an npm package.

npm init

Now, set up your project using this package.json file.

{
  "name": "frontend-project",
  "version": "1.0.0",
  "description": "AI generated project",
  "main": "index.js",
  "scripts": {
    "build": "tailwindcss -i ./src/input.css -o ./dist/output.css --watch",
    "serve": "http-server -p 3000",
    "start": "npm-run-all --parallel build serve"
  },
  "author": "sunilkumardash",
  "license": "ISC",
  "devDependencies": {
    "tailwindcss": "^3.4.12",
    "autoprefixer": "latest",
    "http-server": "latest",
    "npm-run-all": "^4.1.5",
    "postcss": "latest"
  }
}

This is our front-end structure.

.
├── backup_index.html
├── dist
│   ├── output.css
│   └── style.css
├── index.css
├── index.html
├── package.json
├── package-lock.json
├── src
│   └── input.css
└── tailwind.config.js

Add boilerplate code to the index.html file.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Website generator</title>
    <link href="./dist/output.css" rel="stylesheet">
</head>
<body class="bg-gray-100">
    <div class="container mx-auto px-4 py-8">
        <h1 class="text-4xl font-bold text-center text-blue-600 mb-8">
            Website generator! 
        </h1>
        <div class="bg-white shadow-md rounded-lg p-6">
            <p class="text-2xl text-indigo-600 mb-6 font-serif italic font-bold">
                Prepare yourself for a delightful website, coming your way momentarily...
            </p>
        </div>
    </div>
</body>
</html>

Create a Tailwind config file.

npx tailwindcss init

Configure Tailwind.

/** @type {import('tailwindcss').Config} */
module.exports = {
  content: ["./*.html"],
  theme: {
    extend: {},
  },
  plugins: [],
} 

Add the Tailwind directives to src/input.css.

@tailwind base;
@tailwind components;
@tailwind utilities;

Now, run the server with npm.

npm start

Building the AI Tool

Before going forward, create a virtual environment.

python -m venv code-agent

cd code-agent

source bin/activate

Install the following libraries.

pip install composio-core
pip install crewai composio-crewai
pip install streamlit
pip install python-dotenv

Next, create a .env file and add an environment variable for the OpenAI API key.

OPENAI_API_KEY=your API key

To create an OpenAI API key, go to the official site and generate one from the dashboard.

OpenAI Key
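Agents fail in confusing ways when the key is missing, so it can help to validate the variable right after loading the .env file. A small hypothetical fail-fast helper (not part of python-dotenv) could look like this:

```python
import os

def require_env(name: str) -> str:
    """Return the environment variable's value, failing fast if it is unset."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# e.g. after dotenv.load_dotenv():
# api_key = require_env("OPENAI_API_KEY")
```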

Set Up Composio

Now, set up Composio to access the necessary tools: file tools for reading and writing files, Exa for internet search, and a browser tool for capturing screenshots of websites.

First, log in to your account by running the following command.

composio login

This will redirect you to log in or sign up to Composio.

Composio Login

Upon logging in, a screen with a key will appear.

Auth key

Copy it and paste it into the terminal.

Now, update apps.

composio apps update

Integrate Exa with Composio.

composio add exa

You will be asked to provide the API key, which you can get from their official page.

Exa API

Now, you are ready to move to the coding part.

Import libraries and define LLM instance

Import required libraries and load environment variables from the .env file.

from crewai import Agent, Task, Crew, Process
from langchain_openai import ChatOpenAI
from composio_crewai import ComposioToolSet, Action, App
import dotenv
import os

dotenv.load_dotenv() 

Define the OpenAI LLM instance.

# add OPENAI_API_KEY to env variables.
llm = ChatOpenAI(model_name="gpt-4o", api_key=os.getenv("OPENAI_API_KEY"))

Define variables with the paths to the index.html and input.css files.

index_file = "/home/sunil/Documents/Composio/frontend-codegen/frontend-project/index.html"
input_css = "/home/sunil/Documents/Composio/frontend-codegen/frontend-project/src/input.css"
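The absolute paths above are machine-specific. If you prefer, a pathlib-based sketch keeps them portable (this assumes the frontend-project folder sits inside your current working directory; adjust to your layout):

```python
from pathlib import Path

# Assumption: frontend-project lives in the current working directory.
PROJECT_ROOT = Path.cwd() / "frontend-project"
index_file = str(PROJECT_ROOT / "index.html")
input_css = str(PROJECT_ROOT / "src" / "input.css")
```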

Define Agents and Tasks

First, define the toolsets for file operations and Exa.

def convert_structured_data_into_webpage(query):
    # get the structured data from the user
    # use crewai to edit the index.html file

    # Get All the tools
    composio_toolset = ComposioToolSet()
    exa_tools = composio_toolset.get_tools(apps=[App.EXA])
    file_tools = composio_toolset.get_tools(
        apps=[App.FILETOOL], actions=[Action.BROWSER_TOOL_GET_SCREENSHOT]
    )

Next, define CrewAI agents for search and web editing.

    search_agent = Agent(
        role="Search Agent",
        goal="Search online and gather information to create a humorous but professional critique of the term",
        backstory=(
            """You are a professional information gatherer.
            The information you provide will be used to design a website that highlights the quirks and
            peculiarities of the search term in a lighthearted, professional manner. Gather pros/cons or
            good/bad information from different angles and sources, and present all of it in a form
            that could be used to create a website.
            Send the information in a structured format, such as "Here are all the pros," "Here are all
            the cons," "Here's the long-term impact," "Here's the short-term impact," etc.
            Divide it into categories so the website designers get clear ideas on how to modify the website.
            """
        ),
        verbose=True,
        tools=exa_tools,
        llm=llm,
    )

    # Define agent
    web_editor_agent = Agent(
        role="Web Humor Editor",
        goal="Modify index.html to create a well-designed, humorous website based on the information provided",
        backstory=(
            f"""You are a professional AI agent specialized in web development with a talent for creating
            entertaining content. Your task is to edit the index.html file to create a well-structured,
            visually appealing website.

            You have a live server running on top of the index.html file at {index_file}.
            Modify the {index_file} file to showcase the humorous critique in a
            visually appealing and well-organized way. Use Tailwind CSS classes effectively to create a
            responsive and attractive layout. You can edit the CSS in the {input_css}
            file if needed, but prioritize using Tailwind classes for styling.

            Incorporate relevant images, use appropriate and readable fonts, implement a pleasing colour scheme,
            and include design elements that contribute to the humour while maintaining a professional look.
            Your goal is to create a website that's both informative in its critique and visually engaging,
            perfectly balancing humour and good design principles.
            """
        ),
        verbose=True,
        tools=file_tools,
        llm=llm,
    )

In the above code blocks,

  • We defined two agents, each with a role, goal, and backstory.
  • These provide additional context to the LLMs for task execution.
  • Each agent has the appropriate tools.

Define tasks

Now, define tasks that the agents will accomplish.

    axis_task = Task(
        description=f"Identify 3-5 distinct axes or categories along which we can present information about the topic: {query}. These axes should provide a comprehensive and interesting perspective on the subject.",
        agent=search_agent,
        expected_output="A list of 3-5 axes or categories, each separated by a double line break.",
    )

    search_task = Task(
        description=f"Search the web to gather detailed information about {query}. For each axis, collect quirky facts, peculiarities, and interesting information that highlights both positive and negative aspects.",
        agent=search_agent,
        expected_output="Detailed information for each axis, separated by double line breaks. Each section should start with the axis name in bold. Pass on the links to the information you find along with the axis name.",
    )

    web_editor_task = Task(
        description="""Edit index.html to incorporate the provided information into a well-designed, comprehensive website. Ensure it's visually appealing, well-structured, and uses Tailwind CSS classes effectively. Keep it pretty and professional.
        You can add links to external pages so users can learn more about the topic. Try to organise the information along the axes provided.
        """,
        agent=web_editor_agent,
        expected_output="A professionally designed website with a comprehensive critique of the term. Include relevant images and appropriate fonts that enhance the humor while maintaining readability.",
    )

In the above code block,

  • We defined three tasks: an axis task for finding different angles to research, a search task for gathering data, and a web-editor task for writing code.
  • Each task has a description, an assigned agent, and an expected output.

Now, define the Crew.

    my_crew = Crew(
        agents=[search_agent, web_editor_agent],
        tasks=[search_task, web_editor_task],
        process=Process.sequential,
    )

    result = my_crew.kickoff()
    print(result)

Finally, the agents and tasks are combined into a Crew. Process.sequential means the tasks will be executed one after the other.
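As a toy illustration (not the CrewAI API), sequential processing simply means each task runs after the previous one, with earlier output available as context:

```python
# Toy model of Process.sequential: tasks run in order, each receiving
# the accumulated context from the tasks before it.
def run_sequential(tasks, context=""):
    for task in tasks:
        context = task(context)
    return context

# Hypothetical stand-ins for the search and web-editor tasks.
pipeline = [
    lambda ctx: ctx + "[searched web]",
    lambda ctx: ctx + "[edited index.html]",
]
```

CrewAI handles the context passing for you; this sketch only shows the ordering guarantee that Process.sequential provides.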

Create a Streamlit Front End

Spin up a simple Streamlit front end for a nice interactive interface.

In a new main.py file, paste the following code.

import streamlit as st
# from get_data_from_exa import search_ai_startups
# from structure_data_using_claude import structure_data_using_claude
from code_agent import convert_structured_data_into_webpage
import shutil
import sys
import time

def main():
    st.set_page_config(page_title="Website Generator", page_icon="🌐")
    st.title("Website Generator")

    # Copy backup file
    shutil.copy('/home/sunil/Documents/Composio/frontend-codegen/frontend-project/backup_index.html', '/home/sunil/Documents/Composio/frontend-codegen/frontend-project/index.html')

    # Input field for query
    query = st.text_input("Enter a query:")

    if st.button("Generate Website"):
        if query:
            with st.spinner("Generating website..."):
                try:
                    crew_run = convert_structured_data_into_webpage(query)
                    st.success("Website modification completed!")
                except Exception as e:
                    st.error(f"An error occurred: {str(e)}")
        else:
            st.warning("Please enter a query.")

if __name__ == "__main__":
    main()

Once everything is done, run the Streamlit app.

streamlit run main.py

You will be redirected to the web interface on localhost.

Streamlit UI

Pass in the website name and click Generate Website. The crew of agents will spring into action, updating the HTML and CSS files to render a beautiful website on the Node server.

I asked it to critique Composio's Blog page. This is what I got.

Blog Critique

Complete Code: GitHub

Next Steps

In this article, you built a complete AI tool that researches the web for information and presents its findings on a website.

If you liked the article, explore and star the Composio repository for more AI use cases.

Thank you for reading the article!
