DEV Community

Nari
Nari

Posted on • Updated on

Implement Vercel AI SDK and LangChain JS with Next.js

Introduction

Try to implement a chat page using ChatGPT's API.
I will use Vercel AI SDK, but I also tried LangChain with it.


Prerequisites

  • Experience with Next.js development

Advance preparation

  • Create an API key from OpenAI platform.

https://platform.openai.com/


Sample code

Steps

  • Create a Next.js + Typescript
  • Write OPENAI_API_KEY in env.local
  • Create /api/aiSdkChat.ts and /api/langChainChat.ts and edit index.tsx

Create a Next.js + Typescript

yarn create next-app sample-ai --typescript
cd sample-ai
yarn add ai openai-edge
yarn add langchain
Enter fullscreen mode Exit fullscreen mode

Implementing SDKs

/api/aiSdkChat.ts

import { OpenAIStream, StreamingTextResponse } from "ai";
import { Configuration, OpenAIApi } from "openai-edge";
import { NextRequest } from "next/server";

export const runtime = "edge";

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(config);

export default async function handler(req: NextRequest) {
  const { messages } = await req.json();

  const response = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    stream: true,
    temperature: 0.9,
    messages: messages.map((message: any) => ({
      content: message.content,
      role: message.role,
    })),
  });
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
Enter fullscreen mode Exit fullscreen mode

/api/langChainChat.ts

import { StreamingTextResponse, LangChainStream, Message } from "ai";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { AIMessage, HumanMessage } from "langchain/schema";
import { NextRequest } from "next/server";

export const runtime = "edge";

export default async function handler(req: NextRequest) {
  const { messages } = await req.json();
  const { stream, handlers } = LangChainStream();

  const llm = new ChatOpenAI({
    modelName: "gpt-3.5-turbo",
    streaming: true,
    temperature: 0.9,
  });

  llm
    .call(
      (messages as Message[]).map((m) =>
        m.role == "user"
          ? new HumanMessage(m.content)
          : new AIMessage(m.content)
      ),
      {},
      [handlers]
    )
    .catch(console.error);
  return new StreamingTextResponse(stream);
}
Enter fullscreen mode Exit fullscreen mode

index.tsx
In useChat, specify either Vercel AI SDK or LangChain.

import type { NextPage } from "next";
import { useChat } from "ai/react";

const Home: NextPage = () => {
  const aiSdkChat = `/api/aiSdkChat`;
  const langChainChat = `/api/langChainChat`;

  const { messages, input, isLoading, stop, handleInputChange, handleSubmit } =
    useChat({
      api: aiSdkChat,
    });

  if (!isLoading) console.log(messages);

  return (
    <>
      <div className="mx-auto w-full max-w-md py-24 flex flex-col">
        <p className="font-bold text-lg">ChatGPT</p>
        <br />
        {messages.map((m) => (
          <div key={m.id} className="w-96 mb-2 p-2">
            {m.role === "user" ? "Human: " : "AI: "}
            {m.content}
          </div>
        ))}

        <br />
        <form onSubmit={handleSubmit}>
          <input
            name="box"
            className="w-96 flex rounded bottom-0 border border-gray-300 text-gray-700 mb-2 p-2"
            value={input}
            onChange={handleInputChange}
          />
          {isLoading ? (
            <button
              type="submit"
              className="opacity-50 cursor-not-allowed w-96 rounded bg-sky-500 hover:bg-sky-700 mb-2 p-2"
              disabled
            >
              Send
            </button>
          ) : (
            <button
              type="submit"
              className="w-96 rounded bg-sky-500 hover:bg-sky-700 mb-2 p-2"
            >
              Send
            </button>
          )}
        </form>
        <p className="w-96 text-slate-500 text-xs">
          You can check the value of a message variable from the console of the
          development tool to see how the value is stored.
        </p>
      </div>
    </>
  );
};

export default Home;
Enter fullscreen mode Exit fullscreen mode

Start with yarn dev.
Try to display localhost:3000 in a PC browser.

Image description

Conversation with AI.

Image description

It appears to be working properly.


Conclusion

Thanks to the Vercel AI SDK, it was very easy to implement such a simple chat page.
Adding further error handling would provide an opportunity for better understanding.

LangChain has many interesting features such as AI Argent, which I will try again in the future.

Reference site
https://sdk.vercel.ai/docs/api-reference/use-chat
https://sdk.vercel.ai/docs/api-reference/langchain-stream
https://js.langchain.com/docs/get_started

Top comments (0)