
Stephen Collins

How to use SQLite to store and query vector embeddings

As large language models (LLMs) continue to gain prominence, the concern for data privacy has grown exponentially. Relying on an external vector database could mean entrusting your private information to third-party services, especially when building chatbots.

What if there was a more straightforward, self-managed solution that prioritizes the privacy of both your data and your customers' information? Today, I'm going to walk through an example TypeScript project that shows exactly how to achieve this using sqlite-vss, a SQLite extension for storing and searching vector embeddings directly inside a SQLite database, all while keeping data control firmly in your hands.

We'll be using TensorFlow.js with the Universal Sentence Encoder to create vector embeddings, and SQLite to store and search them. The Universal Sentence Encoder is a free, open-source model and should run on most consumer hardware. All of the code in this blog post is available at this GitHub repository.

Overview

This blog post is divided into three major parts:

  1. Embeddings: Setting up and using the Universal Sentence Encoder to create vector embeddings.
  2. Database: Handling SQLite database setup, extensions loading, and table creation.
  3. Main Functionality: Inserting, searching, and managing chat history using vectors.

Prerequisites

You'll need Node.js version 18 or newer installed, along with the following dependencies:

  • TensorFlow's Universal Sentence Encoder
  • SQLite3
  • TypeScript

You can find the required dependencies in the package.json here.

Part 1: Embeddings

Loading Universal Sentence Encoder

We're using the Universal Sentence Encoder (research paper here) to create vector embeddings from raw string input. Here's how the model is loaded in embed.ts:

import "@tensorflow/tfjs-node"
import * as tf from "@tensorflow/tfjs-node"
import * as use from "@tensorflow-models/universal-sentence-encoder"

let model: use.UniversalSentenceEncoder

// ...

export async function setupEmbeddings() {
  try {
    // waiting until tensorflow is ready
    // before loading the universal sentence encoder model
    // that will create our vector embeddings from raw string input
    await tf.ready()
    model = await use.load()
  } catch (err) {
    console.error("failed to load the model", err)
  }
}
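
One thing to keep in mind: createEmbedding (shown below) assumes the model has already loaded, so setupEmbeddings must be awaited before any embeddings are created. A minimal guard you could add to embed.ts (an assumption on my part, not part of the original project):

// Hypothetical guard (not in the original project): fail loudly if an embedding
// is requested before setupEmbeddings() has finished loading the model.
function ensureModelLoaded() {
  if (!model) {
    throw new Error("Call setupEmbeddings() before creating embeddings")
  }
}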

An important note here is that you are not limited to using the Universal Sentence Encoder. As stated in the usage section of the sqlite-vss documentation, the extension works with any embedding or vector data you have.
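
For example, here's a minimal sketch of what an embedding-agnostic setup could look like, where any model that maps a string to a numeric vector can be plugged in. The EmbeddingFn type and alternativeEmbed function are hypothetical, just to illustrate the shape; only the vector dimension needs to match the vss0 table declaration we'll create later:

// A hypothetical embedding-agnostic interface (not part of the original project).
// Any function that maps a string to an array of numbers can back the embedding step.
type EmbeddingFn = (input: string) => Promise<number[]>

// Placeholder embedder: swap in any model or API you prefer, as long as the
// vector length matches the dimension declared on the vss0 virtual table.
const alternativeEmbed: EmbeddingFn = async (_input: string) => {
  return new Array(512).fill(0)
}

async function createEmbeddingWith(embed: EmbeddingFn, input: string) {
  const vector = await embed(input)
  // store the vector as a JSON string, just like the createEmbedding function below
  return JSON.stringify(vector)
}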

Creating Embeddings

We're exporting functions that create embeddings for queries and chat messages (still in the embed.ts file):

async function createEmbedding(input: string) {
  const embedding = await model.embed(input)
  // we need to get a serializable array from the output tensor
  const embeddingArray = await embedding.array()
  // the first element of the array is the vector we want to store as a string in our database.
  return JSON.stringify(embeddingArray[0])
}

export async function createQueryEmbedding(query: string) {
  return createEmbedding(query)
}

export async function createMessageEmbedding(
  type: string,
  command: string,
  content: string,
  timestamp: string
) {
  return createEmbedding(
    `type: ${type} command: ${command} content: ${content} timestamp: ${timestamp}`
  )
}
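
As a quick sanity check, you can confirm the embedding dimension is 512, which is what the vss_chatHistory virtual table will be declared with in Part 2. Here's a small sketch you could drop into embed.ts, assuming setupEmbeddings has already run:

// Sanity check (not part of the original project): the Universal Sentence Encoder
// produces 512-dimensional vectors, matching the vss0(message_embedding(512)) declaration.
export async function checkEmbeddingDimension() {
  const serialized = await createQueryEmbedding("hello world")
  const vector: number[] = JSON.parse(serialized)
  console.log("embedding dimension:", vector.length) // expected: 512
}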

Part 2: Database

Here, we'll go over db.ts, a small abstraction over the details of talking to our SQLite database.

Opening the Database

We open the database with openDatabase:

import { Database, OPEN_READWRITE, OPEN_CREATE } from "sqlite3"
import { createMessageEmbedding, createQueryEmbedding } from "./embed"

const vectorExtensionPath = "./vector0.dylib"
const vssExtensionPath = "./vss0.dylib"
const DB_PATH = "./chat.sqlite"
let db: Database

// create or open the chat.sqlite database
function openDatabase(): Promise<Database> {
  return new Promise((resolve, reject) => {
    const db = new Database(DB_PATH, OPEN_READWRITE | OPEN_CREATE, (err) => {
      if (err) {
        reject(err)
        return
      }
      resolve(db)
    })
  })
}

Loading Extensions

We load the necessary extensions for handling vectors:

// load a SQLite extension
function loadExtension(db: Database, path: string): Promise<void> {
  return new Promise((resolve, reject) => {
    db.loadExtension(path, (err) => {
      if (err) {
        reject(err)
      } else {
        resolve()
      }
    })
  })
}
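
One thing to note: the .dylib suffix in the extension paths above is macOS-specific; on Linux the compiled sqlite-vss binaries end in .so. A small sketch of picking the suffix per platform (the helper and file layout are assumptions, not part of the original project):

import { platform } from "os"

// Hypothetical helper (not in the original project): choose the extension suffix
// for the current OS, assuming the vector0/vss0 binaries sit in the project root.
const extensionSuffix = platform() === "darwin" ? "dylib" : "so"
const vectorPath = `./vector0.${extensionSuffix}`
const vssPath = `./vss0.${extensionSuffix}`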

Setting up the Database

Two tables are created: a regular table for the chat history (chatHistory) and a virtual table for the embeddings (vss_chatHistory), which uses the sqlite-vss extension:

export async function setupDatabase(): Promise<Database> {
  db = await openDatabase()
  try {
    await loadExtension(db, vectorExtensionPath)
    console.log("vector extension loaded")
  } catch (err) {
    console.error("Failed to load vector extension", err)
    throw err
  }

  try {
    // load the SQLite vector extension
    // https://github.com/asg017/sqlite-vss
    await loadExtension(db, vssExtensionPath)
    console.log("vss extension loaded successfully")
  } catch (err) {
    console.error("Failed to load vss extension", err)
    throw err
  }

  // Checking to make sure the extension was loaded properly
  await new Promise<void>((resolve, reject) => {
    db.get(
      "SELECT vss_version() AS version",
      (err, row: { version: string }) => {
        if (err) {
          console.error("Error running vss_version()", err)
          reject()
        } else {
          console.log("vss_version:", row.version) // 'v0.0.1'
          resolve()
        }
      }
    )
  })

  // Next, create the main chatHistory table, and if that succeeds
  // then create the virtual table vss_chatHistory
  return new Promise((resolve, reject) => {
    // we are storing our vectors as TEXT in the "message_embedding" column
    db.run(
      `CREATE TABLE IF NOT EXISTS chatHistory (
        type TEXT,
        command TEXT,
        content TEXT,
        timestamp TEXT,
        message_embedding TEXT
      );`,
      (creationError) => {
        if (creationError) {
          console.error("Error creating chatHistory table", creationError)
          reject(creationError)
          return
        }

        console.log("Successfully created chatHistory table")
        db.run(
          `CREATE VIRTUAL TABLE IF NOT EXISTS vss_chatHistory using vss0(message_embedding(512));`,
          (creationError) => {
            if (creationError) {
              console.error(
                "Error creating vss_chatHistory table",
                creationError
              )
              reject(creationError)
              return
            }

            console.log("Successfully created vss_chatHistory virtual table")
            resolve(db)
          }
        )
      }
    )
  })
}

Part 3: Main Functionality

Adding to Chat History

Continuing in our db.ts file, we insert a chat message along with its embedding into both the main table and the corresponding virtual table:

export async function addToChatHistory(
  type: "user" | "ai",
  command: string,
  content: string
): Promise<void> {
  const timestamp = new Date().toISOString()
  const messageEmbedding = await createMessageEmbedding(
    type,
    command,
    content,
    timestamp
  )
  return new Promise<void>((resolve, reject) => {
    // Insert into our chatHistory table
    db.run(
      "INSERT INTO chatHistory (type, command, content, timestamp, message_embedding) VALUES (?, ?, ?, ?, ?)",
      [type, command, content, timestamp, messageEmbedding],
      function (err) {
        if (err) {
          console.error("Error inserting into chatHistory", err)
          reject(err)
          return
        }

        const lastRowId = this.lastID
        // Insert into our vss_chatHistory virtual table, keeping the rowid values in sync with chatHistory
        db.run(
          "INSERT INTO vss_chatHistory(rowid, message_embedding) VALUES (?, ?)",
          [lastRowId, messageEmbedding],

          (err) => {
            if (err) {
              console.error("Error inserting into vss_chatHistory", err)
              reject(err)
              return
            }
            return resolve()
          }
        )
      }
    )
  })
}
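
Because the two tables are only linked by their rowid values, deletes need the same two-step treatment. A minimal sketch of removing a message from both tables, written as a helper you could add to db.ts (this helper is an assumption, not part of the original project):

// Hypothetical helper (not in the original project): delete a chat message and its
// embedding, keeping chatHistory and vss_chatHistory in sync by rowid.
export function deleteFromChatHistory(rowid: number): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    db.run("DELETE FROM chatHistory WHERE rowid = ?", [rowid], (err) => {
      if (err) {
        reject(err)
        return
      }
      db.run("DELETE FROM vss_chatHistory WHERE rowid = ?", [rowid], (deleteErr) => {
        if (deleteErr) {
          reject(deleteErr)
          return
        }
        resolve()
      })
    })
  })
}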

Searching Chat History

We perform a vector search using the k-nearest neighbors algorithm to find the closest vectors to our query:

// This function performs the vector search, using "k nearest neighbors" algorithm to
// find the closest 10 (from the 'limit 10') vectors to our search input.
// The vectors are sorted by "distance", where the smallest "distance"
// is the vector most similar to our query embedding
export async function searchChatHistory(query: string) {
  const queryEmbedding = await createQueryEmbedding(query)
  return new Promise((resolve, reject) => {
    db.all(
      `with matches as (
          select rowid,
          distance
          from vss_chatHistory where vss_search(message_embedding, (?))
          limit 10
        )
        select
        chatHistory.type,
        chatHistory.command,
        chatHistory.content,
        chatHistory.timestamp,
        matches.distance
        from matches 
        left join chatHistory on chatHistory.rowid = matches.rowid`,
      [queryEmbedding],
      function (err: any, result: any) {
        if (err) {
          return reject(err)
        }
        return resolve(result)
      }
    )
  })
}
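
Each row that comes back combines the original chat columns with the match distance. A type you could layer on top to make that shape explicit (the interface and helper below are assumptions based on the SELECT above, not part of the original project):

// Assumed shape of each row returned by searchChatHistory, based on the SELECT above.
// Assumes: import { searchChatHistory } from "./db"
interface ChatHistoryMatch {
  type: string
  command: string
  content: string
  timestamp: string
  distance: number // smaller distance means a closer match to the query embedding
}

async function logMatches(query: string) {
  const matches = (await searchChatHistory(query)) as ChatHistoryMatch[]
  for (const match of matches) {
    console.log(match.distance.toFixed(4), match.content)
  }
}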

Final Steps

In the main function in index.ts, we combine all the parts: setting up embeddings and the database, adding some test data, and performing a vector search:

import { addToChatHistory, searchChatHistory, setupDatabase } from "./db"
import { setupEmbeddings } from "./embed"

async function main() {
  await setupEmbeddings()
  await setupDatabase()
  await addToChatHistory("user", "LIST_PROJECTS", "list projects")
  await addToChatHistory(
    "ai",
    "LIST_PROJECTS",
    "todo-list,react-notepad,gpt-engineer,pocketbase"
  )
  await addToChatHistory("user", "SEARCH_WEB", "search for keanu reeves")
  const result = await searchChatHistory("my favorite actor")
  // should log a list of chat history rows sorted by relevance to the query "my favorite actor"
  console.log(result)
}

main()
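
The example never closes the database handle, which is fine for a short script, but in a longer-running app you'd likely want to close it when you're done. A small helper you could add to db.ts (an assumption, not part of the original project):

// Hypothetical addition to db.ts (not in the original project): close the SQLite
// handle once all reads and writes have finished.
export function closeDatabase(): Promise<void> {
  return new Promise((resolve, reject) => {
    db.close((err) => (err ? reject(err) : resolve()))
  })
}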

Conclusion

This blog post covered a simple example of how to combine modern NLP techniques with a relational database to efficiently manage vector data.

Whether you're building an AI-powered chat application or simply exploring new techniques for text search, the code we've examined in this article should shed light on how to use vector embeddings with a familiar database like SQLite. By doing so, you can build a backend that opens up new possibilities in semantic text search for both you and your users.

Make sure to check out the sqlite-vss GitHub repository for more information on this powerful extension!
