DEV Community

Trieu Le


AI Chatbot with Cloudflare Worker AI Model and Vercel AI SDK

This is a submission for the Cloudflare AI Challenge.

What I Built

A simple chatbot app built with the llama-2-7b-chat-fp16 model from Cloudflare Workers AI and the Vercel AI SDK.

Demo

AI Chatbot

My Code

GitHub project

Journey

Just a simple chatbot built with Next.js, using the prebuilt useChat hook from the Vercel AI SDK to handle streaming responses from the Cloudflare Worker, and the prebuilt chat bubble component from DaisyUI for the UI.
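The key idea in this setup is that the Worker's reply arrives as a stream of token chunks, which useChat renders incrementally instead of waiting for the full response. A minimal self-contained sketch of that consumption pattern (the function names and token list here are illustrative, not taken from the project):

```typescript
// Sketch: a streamed chat response is a ReadableStream of encoded text
// chunks; the client reads and decodes them one by one as they arrive.

// Build a stream that emits each token as a separate chunk (stands in
// for the model's streaming output).
function makeTokenStream(tokens: string[]): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      for (const t of tokens) controller.enqueue(encoder.encode(t));
      controller.close();
    },
  });
}

// Consume the stream chunk by chunk, the way a chat UI appends text.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

async function main() {
  const stream = makeTokenStream(["Hello", ", ", "world", "!"]);
  console.log(await readAll(stream)); // "Hello, world!"
}
main();
```

In the real app, useChat does the reading loop for you and re-renders the message as each chunk lands; the sketch only shows why the UI can display partial text before the model has finished.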

Tech Stack

Demo photo

Top comments (9)

Ranjan Dailata

Great demo :)

It's fascinating that you have used the "Vercel AI SDK" for streaming purposes.

Freddy Rangel

I wasn't even aware there was a Vercel AI SDK. I wish I had known about it a few months ago.

Salika Dave

Nice work!

Arnaud Delante

Why did you use DaisyUI when the Vercel AI SDK comes with shadcn?

Trieu Le • Edited

shadcn does not have a chat bubble component like DaisyUI does, and as a mobile developer I'm not very proficient in web UI, so I prefer to use the components available in DaisyUI.

cheraff

Nice work Trieu! I tried to run a local version and ran into some issues.

Opened an issue on GitHub.

Trieu Le

It was a breaking change to StreamingTextResponse in Vercel AI SDK 3.0.20. I've fixed this issue in a new commit.

cheraff

Wow, quick fix thank you! Just got it working.

In preview mode ("npm run preview") the response isn't being streamed; there is a delay, then it's returned all at once onFinished (feels like REST).

Is this expected behaviour? It would be nice to see it stream locally in dev mode.

Trieu Le

I also encountered a similar issue when running in preview mode.