TL;DR
In this tutorial, we will build an AI-powered Q&A bot for your website documentation.
🌐 Create a user-friendly Next.js app to accept questions and URLs
🔧 Set up a Wing backend to handle all the requests
💡 Incorporate @langchain for AI-driven answers by scraping and analyzing documentation using RAG
🔄 Complete connection between frontend input and AI-processed responses.
What is Wing?
Wing is an open-source framework for the cloud.
It allows you to create your application's infrastructure and code combined as a single unit and deploy them safely to your preferred cloud providers.
Wing gives you complete control over how your application's infrastructure is configured. In addition to its easy-to-learn programming language, Wing also supports TypeScript.
In this tutorial, we'll use TypeScript. So, don't worry—your JavaScript and React knowledge is more than enough to understand this tutorial.
Building the frontend with Next.js
Here, you’ll create a simple form that accepts the documentation URL and the user’s question and then returns a response based on the website's data.
First, create a folder containing two sub-folders - frontend and backend. The frontend folder contains the Next.js app, and the backend folder is for Wing.
mkdir qa-bot && cd qa-bot
mkdir frontend backend
Within the frontend folder, create a Next.js project by running the following code snippet:
cd frontend
npx create-next-app ./
Copy the code snippet below into the app/page.tsx file to create the form that accepts the user’s question and the documentation URL:
"use client";
import { useState } from "react";
export default function Home() {
const [documentationURL, setDocumentationURL] = useState<string>("");
const [question, setQuestion] = useState<string>("");
const [disable, setDisable] = useState<boolean>(false);
const [response, setResponse] = useState<string | null>(null);
const handleUserQuery = async (e: React.FormEvent) => {
e.preventDefault();
setDisable(true);
console.log({ question, documentationURL });
};
return (
<main className='w-full md:px-8 px-3 py-8'>
<h2 className='font-bold text-2xl mb-8 text-center text-blue-600'>
Documentation Bot with Wing & LangChain
</h2>
<form onSubmit={handleUserQuery} className='mb-8'>
<label className='block mb-2 text-sm text-gray-500'>Webpage URL</label>
<input
type='url'
className='w-full mb-4 p-4 rounded-md border text-sm border-gray-300'
placeholder='https://www.winglang.io/docs/concepts/why-wing'
required
value={documentationURL}
onChange={(e) => setDocumentationURL(e.target.value)}
/>
<label className='block mb-2 text-sm text-gray-500'>
Ask any questions related to the page URL above
</label>
<textarea
rows={5}
className='w-full mb-4 p-4 text-sm rounded-md border border-gray-300'
placeholder='What is Winglang? OR Why should I use Winglang? OR How does Winglang work?'
required
value={question}
onChange={(e) => setQuestion(e.target.value)}
/>
<button
type='submit'
disabled={disable}
className='bg-blue-500 text-white px-8 py-3 rounded'
>
{disable ? "Loading..." : "Ask Question"}
</button>
</form>
{response && (
<div className='bg-gray-100 w-full p-8 rounded-sm shadow-md'>
<p className='text-gray-600'>{response}</p>
</div>
)}
</main>
);
}
The code snippet above displays a form that accepts the user’s question and the documentation URL and logs them to the console for now.
Perfect! 🎉 You’ve completed the application's user interface. Next, let’s set up the Wing backend.
How to set up Wing on your computer
Wing provides a CLI that enables you to perform various Wing actions within your projects.
It also provides VSCode and IntelliJ extensions that enhance the developer experience with features like syntax highlighting, compiler diagnostics, code completion and snippets, and many others.
Before we proceed, stop your Next.js development server and install the Wing CLI by running the code snippet below in your terminal.
npm install -g winglang@latest
Run the following code snippet to ensure that the Winglang CLI is installed and working as expected:
wing -V
Next, navigate to the backend folder and create an empty Wing TypeScript project. Ensure you select the empty template and TypeScript as the language.
wing new
Copy the code snippet below into the backend/main.ts file.
import { cloud, inflight, lift, main } from "@wingcloud/framework";
main((root, test) => {
const fn = new cloud.Function(
root,
"Function",
inflight(async () => {
return "hello, world";
})
);
});
The main() function serves as the entry point to Wing: it runs at compile time and creates the cloud function. The inflight function, on the other hand, runs at runtime and returns a "hello, world" text.
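To make the compile-time/runtime split more concrete, here is a small, hedged sketch (not part of this tutorial's app) that defines a cloud.Bucket in main() and writes to it from an inflight closure, using the same lift().grant().inflight() pattern we'll use later for secrets:
import { cloud, lift, main } from "@wingcloud/framework";
main((root) => {
  //👇🏻 compile time: define the infrastructure
  const bucket = new cloud.Bucket(root, "Bucket");
  //👇🏻 runtime: interact with the bucket inside an inflight closure
  new cloud.Function(
    root,
    "Writer",
    lift({ bucket })
      .grant({ bucket: ["put"] })
      .inflight(async (ctx) => {
        await ctx.bucket.put("greeting.txt", "hello, world");
      })
  );
});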
Start the Wing development server by running the code snippet below. It automatically opens the Wing Console in your browser at http://localhost:3000.
wing it
You've successfully installed Wing on your computer.
How to connect Wing to a Next.js app
From the previous sections, you've created the Next.js frontend app within the frontend folder and the Wing backend within the backend folder.
In this section, you'll learn how to communicate and send data between the Next.js app and the Wing backend.
First, install the Wing React library within the backend folder by running the code below:
npm install @winglibs/react
Next, update the main.ts file as shown below:
import { main, cloud, inflight, lift } from "@wingcloud/framework";
import React from "@winglibs/react";
main((root, test) => {
const api = new cloud.Api(root, "api", { cors: true });
//👇🏻 create an API route
api.get(
"/test",
inflight(async () => {
return {
status: 200,
body: "Hello world",
};
})
);
//👉🏻 placeholder for the POST request endpoint
//👇🏻 connects to the Next.js project
const react = new React.App(root, "react", { projectPath: "../frontend" });
//👇🏻 an environment variable
react.addEnvironment("api_url", api.url);
});
The code snippet above creates an API endpoint (/test) that accepts GET requests and returns a Hello world text. The main function also connects to the Next.js project and adds the api_url as an environment variable.
The API URL contained in the environment variable enables us to send requests to the Wing API route. How do we retrieve the API URL within the Next.js app and make these requests?
Update the RootLayout component within the Next.js app/layout.tsx file as shown below:
export default function RootLayout({
children,
}: Readonly<{
children: React.ReactNode;
}>) {
return (
<html lang='en'>
<head>
{/** ---👇🏻 Adds this script tag 👇🏻 ---*/}
<script src='./wing.js' defer />
</head>
<body className={inter.className}>{children}</body>
</html>
);
}
Re-build the Next.js project by running npm run build.
Finally, start the Wing development server. It automatically starts the Next.js server, which can be accessed at http://localhost:3001 in your browser.
You've successfully connected the Next.js app to Wing. You can also access data within the environment variables using window.wingEnv.<attribute_name>.
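As a quick sanity check, you can call the /test endpoint created earlier straight from the browser console once the page has loaded wing.js. This is just an illustrative snippet, not part of the final app:
// assumes wing.js has injected window.wingEnv on the page
const res = await fetch(`${window.wingEnv.api_url}/test`);
console.log(await res.text()); // "Hello world"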
Processing user requests with LangChain and Wing
In this section, you'll learn how to send requests to Wing, process these requests with LangChain and OpenAI, and display the results on the Next.js frontend.
First, let’s update the Next.js app/page.tsx file to retrieve the API URL and send the user's data to a Wing API endpoint.
To do this, extend the JavaScript window object by adding the following code snippet at the top of the page.tsx file.
"use client";
import { useState } from "react";
interface WingEnv {
api_url: string;
}
declare global {
interface Window {
wingEnv: WingEnv;
}
}
Next, update the handleUserQuery function to send a POST request containing the user's question and website's URL to a Wing API endpoint.
//👇🏻 sends data to the api url
const [response, setResponse] = useState<string | null>(null);
const handleUserQuery = async (e: React.FormEvent) => {
e.preventDefault();
setDisable(true);
try {
const request = await fetch(`${window.wingEnv.api_url}/api`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({ question, pageURL: documentationURL }),
});
const response = await request.text();
setResponse(response);
setDisable(false);
} catch (err) {
console.error(err);
setDisable(false);
}
};
Before you create the Wing endpoint that accepts the POST request, install the following packages within the backend folder:
npm install @langchain/community @langchain/openai langchain cheerio
Cheerio enables us to scrape the software documentation webpage, while the LangChain packages allow us to access its various functionalities.
The LangChain OpenAI integration package uses the OpenAI language model; therefore, you'll need a valid API key. You can get yours from the OpenAI Developer's Platform.
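If you'd like to see the scraping and chunking pieces in isolation before wiring them into the endpoint, here is a minimal standalone sketch (run as an ES module, using the Wing docs URL from earlier as an example):
import { CheerioWebBaseLoader } from "@langchain/community/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
//👇🏻 scrape the page into LangChain Document objects
const loader = new CheerioWebBaseLoader("https://www.winglang.io/docs/concepts/why-wing");
const docs = await loader.load();
//👇🏻 split the scraped text into small, overlapping chunks
const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 200, chunkOverlap: 20 });
const chunks = await splitter.splitDocuments(docs);
console.log(`Split the page into ${chunks.length} chunks`);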
Next, let’s create the /api endpoint that handles incoming requests.
The endpoint will:
- accept the questions and documentation URLs from the Next.js application,
- load the documentation page using LangChain document loaders,
- split the retrieved documents into chunks,
- transform the chunked documents and save them within a LangChain vector store,
- and create a retriever function that retrieves the documents from the vector store.
First, import the following into the main.ts file:
import { main, cloud, inflight, lift } from "@wingcloud/framework";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { CheerioWebBaseLoader } from "@langchain/community/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { createRetrievalChain } from "langchain/chains/retrieval";
import React from "@winglibs/react";
Add the code snippet below within the main() function to create the /api endpoint:
api.post(
"/api",
inflight(async (ctx, request) => {
//👇🏻 accept user inputs from Next.js
const { question, pageURL } = JSON.parse(request.body!);
//👇🏻 initialize OpenAI Chat for LLM interactions
const chatModel = new ChatOpenAI({
apiKey: "<YOUR_OPENAI_API_KEY>",
model: "gpt-3.5-turbo-1106",
});
//👇🏻 initialize OpenAI Embeddings for Vector Store data transformation
const embeddings = new OpenAIEmbeddings({
apiKey: "<YOUR_OPENAI_API_KEY>",
});
//👇🏻 creates a text splitter that breaks the scraped documents into chunks
const splitter = new RecursiveCharacterTextSplitter({
chunkSize: 200, //👉🏻 characters per chunk
chunkOverlap: 20,
});
//👇🏻 creates a document loader that loads and scrapes the page
const loader = new CheerioWebBaseLoader(pageURL);
const docs = await loader.load();
//👇🏻 splits the document into chunks
const splitDocs = await splitter.splitDocuments(docs);
//👇🏻 creates a Vector store containing the split documents
const vectorStore = await MemoryVectorStore.fromDocuments(
splitDocs,
embeddings //👉🏻 transforms the data to the Vector Store format
);
//👇🏻 creates a document retriever that retrieves results that answers the user's questions
const retriever = vectorStore.asRetriever({
k: 1, //👉🏻 number of documents to retrieve (default is 2)
});
//👇🏻 creates a prompt template for the request
const prompt = ChatPromptTemplate.fromTemplate(`
Answer this question.
Context: {context}
Question: {input}
`);
//👇🏻 creates a chain containing the OpenAI chatModel and prompt
const chain = await createStuffDocumentsChain({
llm: chatModel,
prompt: prompt,
});
//👇🏻 creates a retrieval chain that combines the documents and the retriever function
const retrievalChain = await createRetrievalChain({
combineDocsChain: chain,
retriever,
});
//👇🏻 invokes the retrieval Chain and returns the user's answer
const response = await retrievalChain.invoke({
input: `${question}`,
});
if (response) {
return {
status: 200,
body: response.answer,
};
}
return undefined;
})
);
The API endpoint accepts the user’s question and the page URL from the Next.js application, initializes ChatOpenAI and OpenAIEmbeddings, and loads the documentation page. It then splits the scraped documents into chunks, saves the chunks in the MemoryVectorStore, and uses a LangChain retriever and retrieval chain to fetch the answer to the user’s question.
From the code snippet above, the OpenAI API key is entered directly into the code; this could lead to security breaches, making the API key accessible to attackers. To prevent this data leak, Wing allows you to save private keys and credentials in variables called secrets.
When you create a secret, Wing saves this data in a .env file, ensuring it is secured and accessible.
Update the main() function to fetch the OpenAI API key from the Wing Secret.
main((root, test) => {
const api = new cloud.Api(root, "api", { cors: true });
//👇🏻 creates the secret variable
const secret = new cloud.Secret(root, "OpenAPISecret", {
name: "open-ai-key",
});
api.post(
"/api",
lift({ secret })
.grant({ secret: ["value"] })
.inflight(async (ctx, request) => {
const apiKey = await ctx.secret.value();
const chatModel = new ChatOpenAI({
apiKey,
model: "gpt-3.5-turbo-1106",
});
const embeddings = new OpenAIEmbeddings({
apiKey,
});
//👉🏻 other code snippets & configurations
})
);
const react = new React.App(root, "react", { projectPath: "../frontend" });
react.addEnvironment("api_url", api.url);
});
From the code snippet above:
- The secret variable declares a name for the secret (the OpenAI API key).
- The lift().grant() call grants the API endpoint access to the secret value stored in the Wing Secret.
- The inflight() function accepts the context and request object as parameters, makes a request to LangChain, and returns the result.
- Then, you can access the apiKey using the ctx.secret.value() function.
Finally, save the OpenAI API key as a secret by running this command in your terminal.
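The exact invocation can vary between Wing CLI versions, but from the backend folder it typically looks like the snippet below; the CLI prompts you to enter the value for the open-ai-key secret defined above (pass your entrypoint, e.g. main.ts, if the CLI asks for one):
cd backend
wing secrets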
Congratulations! You've successfully completed the project for this tutorial.
Here is a brief demo of the application:
Let's dig a little bit deeper into the Wing docs to see what data our AI bot can extract.
Wrapping It Up
So far, we have gone over the following:
- What is Wing?
- How to use Wing and query data using LangChain,
- How to connect Wing to a Next.js application,
- How to send data between a Next.js frontend and a Wing backend.
Wing aims to bring back your creative flow and close the gap between imagination and creation. Another great advantage of Wing is that it is open-source. Therefore, if you are looking to build distributed systems that leverage cloud services, or to contribute to the future of cloud development, Wing is your best choice.
Feel free to contribute to the GitHub repository, and share your thoughts with the team and the large community of developers.
The source code for this tutorial is available here.
Thank you for reading! 🎉