I have installed Devika locally using Docker. As a prerequisite, I had already installed Docker on my Ubuntu machine.
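If Docker (with the Compose plugin) is not installed yet, one common route on Ubuntu is Docker's official convenience script; this is just one option and assumes curl is available on the machine.

```bash
# Install Docker Engine and the Compose plugin via Docker's convenience script.
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Verify both the engine and the compose plugin are available.
sudo docker --version
sudo docker compose version
```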
Devika has three main components:
- Frontend server, which runs on port 3000
- Backend server
- Ollama and the Ollama LLM models, which run on port 11434 (see the quick check below)
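Once the Ollama side is up (whether it runs on the host or inside the compose stack depends on your setup), a quick way to confirm it is reachable on its default port and has a model available is sketched below; `llama3` is only an example model name.

```bash
# Ollama answers a plain GET on its default port (11434) with "Ollama is running".
curl http://localhost:11434

# Pull a model for Devika to use. "llama3" is just an illustration; if Ollama
# runs inside a container, run this through `docker exec` instead.
ollama pull llama3
```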
Devika can also use non-open-source LLM providers such as OpenAI (ChatGPT) and Anthropic's Claude; you just add the corresponding API keys to config.toml in the steps below.
Here are the step-by-step instructions.
Clone the repository into a local directory using the terminal:
git clone https://github.com/stitionai/devika.git
cd devika
Copy the sample config and edit it to add your API keys:
cp sample.config.toml config.toml
sudo nano config.toml
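As an illustration only, the API-key section you edit looks roughly like the sketch below. The actual table and key names are defined by sample.config.toml in the repository, so follow that file rather than this sketch; the values here are placeholders.

```toml
# Illustrative sketch only -- keep the structure and key names from
# sample.config.toml and just fill in the keys for the providers you use.
[API_KEYS]
OPENAI = "sk-..."      # only needed if you use OpenAI (ChatGPT) models
CLAUDE = "sk-ant-..."  # only needed if you use Anthropic's Claude models
# Keys for providers you don't use can stay empty if you rely on Ollama.
```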
Now you can start the installation process using Docker Compose. Devika's repository already contains a docker-compose file, so there is no need to change anything.
sudo docker compose up
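If you prefer to keep the terminal free, the stack can also be started in detached mode and inspected with the standard Docker Compose commands below.

```bash
# Start the stack in the background instead of the foreground.
sudo docker compose up -d

# List the running containers and the ports they expose.
sudo docker compose ps

# Follow the logs of all services (Ctrl+C to stop following).
sudo docker compose logs -f
```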
Now, in the terminal, you can see Devika's containers starting.
Finally, the logs show the port Devika is running on.
You can open localhost:3000 in your browser.
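To confirm the frontend is actually serving before opening the browser, here is a quick check from the terminal (assuming the default port 3000 from the compose file):

```bash
# Should print an HTTP status code (e.g. 200) once the frontend is up on port 3000.
curl -sS -o /dev/null -w "%{http_code}\n" http://localhost:3000

# If nothing responds, check the container logs for errors.
sudo docker compose logs
```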