DEV Community

Join us for the Open Source AI Challenge with pgai and Ollama: $3,000 in Prizes!

dev.to staff on October 30, 2024

We are thrilled to team up with Timescale to bring the community our newest challenge. We think you'll like this one. Running through November 10,...
Jônatas Davi Paganini

This looks awesome! I love hackathons!

Jess Lee

Good luck everyone!

Thành Huy Trương

111

programORdie

I don't think my 5-year-old laptop with 512 MB of GPU RAM will be able to run a 7B LLM 😂

Dela

If we want to use an Ollama model, does that mean we have to host the model ourselves and open up access to the public?

Trang Le

Did you get an answer to your question? I think we have to deploy the model, otherwise how would the judges try them out?
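For context on why hosting matters: a self-hosted Ollama instance exposes a plain HTTP API on port 11434 by default, so judges can only try the model if that endpoint is reachable from their machines. A minimal sketch of querying such an instance, assuming a model you have already pulled (the host and model name here are placeholders):

```python
import json
import urllib.request

# Default local Ollama endpoint; replace the host with your public
# deployment's address if the judges need to reach it.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for one JSON object instead of a chunk stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to /api/generate and return the model's text response."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Example (requires a running Ollama server):
# print(generate("llama3", "Why is the sky blue?"))
```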

Trang Le • Edited

So many questions since I'm not familiar with LLMs at all:

  1. Does one need existing data stored in PostgreSQL?
  2. Since extensions like pgai and pgvector are written in Python, I assume the AI application should also be written in Python?
Emad Mokhtar

No need to write it in Python. You can create a Postgres function and invoke it as a SQL statement from any PG client.

select generate_dag_response('prompt');

This means the chat response is generated from pgai and you are getting it back via Postgres.
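To make that concrete, here is a minimal sketch of invoking such a function from Python through any DB-API 2.0 driver (psycopg2 is one option); the function name generate_dag_response is taken from the comment above and assumed to already exist in your database:

```python
# Note: in Postgres, string literals use single quotes; %s below is the
# DB-API driver's parameter placeholder, which also avoids SQL injection.
SQL = "SELECT generate_dag_response(%s);"

def ask(conn, prompt: str):
    """Run the pgai-backed function over an open database connection
    and return the generated chat response (or None if no row came back)."""
    with conn.cursor() as cur:
        cur.execute(SQL, (prompt,))  # prompt is passed as a bound parameter
        row = cur.fetchone()
        return row[0] if row else None

# Example (requires a live connection, e.g. conn = psycopg2.connect(...)):
# print(ask(conn, "Summarize today's signups"))
```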

Andy Piper • Edited

Is this compatible with the Open Source AI Definition just ratified by the OSI, please? I don't see Ollama on the list of endorsements there.

MrDoe

Depends on the model you are using. Ollama is basically just a wrapper for llama.cpp, and llama.cpp serves as the host for inference with an LLM. Both Ollama and llama.cpp are published under open source licenses, but there are models which are not open source.

Andy Piper

Yes, so the answer is (I think) that this contest is not compatible with the OSI Open Source AI Definition. Thanks.

Trang Le

If I use pgai to create vector embeddings, does that mean I've already used two tools, pgai and the pgai Vectorizer, thereby meeting the prompt requirement?

Douglas Fugazi

YAY!!!

Jearel Alcantara

Hey guys, please fix the URL for dev.to/timescale

Jess Lee

Thanks, fixed!

bardui

Pre-built UI components to help you create stunning websites in no time with bardui.com.

Raj Acharya

hell yeah

Emad Mokhtar

I built a CLI that needs to connect to TimescaleDB and OpenAI. How can I share the credentials with the judges? Without them they can't run the app, and I would rather not publish them in my repo.

Emad Mokhtar

@jess Can you help me please?

dellikilari

This is great, all the best for all participants.

zaheer Abbas

❤️

Amit Devgirkar

👍👍sir

Dhruv Arora

The Discord invite says it's invalid?

Dhruv Arora

nvm it worked now.

Linkmate

wow