Let's add OpenAI-based embeddings to a Phoenix app and store the vectors in the database so we can use them for similarity searches.
The resulting code for this article is captured in this sample project:
https://github.com/byronsalty/questify
Specifically this commit:
https://github.com/byronsalty/questify/commit/cba277ce8104c6f3b29dc2a6f602158ba4d1f62a
Steps:
- Add PG Vector to deps
- Add a migration to install Vector db extension
- Add Postgrex.Types
- Configure OpenAI embedding endpoints
- Hit the API endpoint
- Set up your schema for vector types
- Incorporate vectors into your queries
Add PG Vector to deps
In mix.exs:
{:pgvector, "~> 0.2.0"},
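For context, this entry goes inside the existing deps/0 list in mix.exs (the placeholder comment below stands in for whatever you already have there, and versions may differ); then run mix deps.get to fetch it:

defp deps do
  [
    # ... your existing deps ...
    {:pgvector, "~> 0.2.0"}
  ]
end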
Add Migration to install the vector extension
Create a migration to install the PGVector extension:
defmodule Questify.Repo.Migrations.AddPgVector do
  use Ecto.Migration

  def up do
    execute "CREATE EXTENSION IF NOT EXISTS vector"
  end

  def down do
    execute "DROP EXTENSION vector"
  end
end
Add Postgrex.Types
This is so you can use the vector type in your type definitions.
Add a file postgrex_types.ex in your /lib folder with the following contents:
Postgrex.Types.define(
  Questify.PostgrexTypes,
  [Pgvector.Extensions.Vector] ++ Ecto.Adapters.Postgres.extensions(),
  []
)
Add this to your config.exs:
config :questify, Questify.Repo, types: Questify.PostgrexTypes
Configure your app with the OpenAI endpoints
Here are the relevant lines to add to your configuration. I put this into the runtime.exs file:
openai_api_key =
  System.get_env("OPENAI_API_KEY") ||
    raise """
    environment variable OPENAI_API_KEY is missing
    """
config :questify, :openai,
  openai_api_key: openai_api_key,
  embedding_url: "https://api.openai.com/v1/embeddings",
  embedding_model: "text-embedding-ada-002"
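Elsewhere in the app you can read these values back with Application.fetch_env!/2. A minimal sketch (the local variable names here are just illustrative):

openai_config = Application.fetch_env!(:questify, :openai)
openai_api_key = Keyword.fetch!(openai_config, :openai_api_key)
embedding_url = Keyword.fetch!(openai_config, :embedding_url)
embedding_model = Keyword.fetch!(openai_config, :embedding_model)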
Hit the API
The embedding API takes in a text string and returns a JSON response containing the embedding.
You can hit the API with HTTPoison like so:
response =
  HTTPoison.post(
    embedding_url,
    Jason.encode!(%{
      input: text,
      model: Keyword.get(opts, :model, embedding_model)
    }),
    [
      {"Content-Type", "application/json"},
      {"Authorization", "Bearer #{openai_api_key}"}
    ]
  )
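The response body is JSON with a data list whose first element holds the embedding. A minimal sketch of pulling the vector out, assuming a successful 200 response (error handling omitted):

# Match on a successful HTTP response and decode the JSON body.
{:ok, %HTTPoison.Response{status_code: 200, body: body}} = response

embedding =
  body
  |> Jason.decode!()
  |> Map.get("data")
  |> List.first()
  |> Map.get("embedding")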
See the whole file here: https://github.com/byronsalty/questify/blob/cba277ce8104c6f3b29dc2a6f602158ba4d1f62a/lib/questify/embeddings.ex
Add a vector to one of your schemas
Add a migration like this:
defmodule Questify.Repo.Migrations.AddCommandEmbeddings do
  use Ecto.Migration

  def change do
    alter table(:actions) do
      add :embedding, :vector, size: 1536
    end
  end
end
Then update your schema with a new vector type field like this:
field :embedding, Pgvector.Ecto.Vector
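In context, the field sits alongside the rest of your schema. A rough sketch of what the Action schema might look like (the module name and the other fields are assumptions for illustration, not taken from the actual project):

defmodule Questify.Games.Action do
  use Ecto.Schema

  schema "actions" do
    field :command, :string
    field :description, :string
    field :embedding, Pgvector.Ecto.Vector

    timestamps()
  end
end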
Incorporate Vectors into Queries
You'll need to add this line at the top of any module where you plan to query with PGVector-specific functions, importantly including the "distance" ones that we're looking to use.
import Pgvector.Ecto.Query
Now you can query like the following:
def get_action_by_text(location, text) do
  embedding = Questify.Embeddings.embed!(text)
  min_distance = 1.0

  Repo.all(
    from a in Action,
      order_by: cosine_distance(a.embedding, ^embedding),
      limit: 1,
      where: cosine_distance(a.embedding, ^embedding) < ^min_distance,
      where: a.from_id == ^location.id
  )
end
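Of course, the query only finds matches if the embedding column has been populated. One way to do that is to compute and store the embedding when an action is created or updated. A sketch, assuming embed!/1 returns a plain list of floats (which the Pgvector.Ecto.Vector type can cast) and that the action has a description field as in the schema sketch above:

# Hypothetical helper: compute and persist the embedding for an action's text.
def put_embedding(action) do
  embedding = Questify.Embeddings.embed!(action.description)

  action
  |> Ecto.Changeset.change(%{embedding: embedding})
  |> Repo.update()
end

With embeddings stored, get_action_by_text/2 can match free-form input text to the nearest stored action for a given location.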
Follow along for more ways to add LLM-related capabilities to your Phoenix apps.