Paweł Ciosek

CoLlama 🦙 - code autocompletion (local machine and free)

In a recent post, I wrote about using codellama as our local code assistant.

There was a sentence:

The extension does not support code completion. If you know an extension that supports code completion, please let me know in the comments. 🙏

You know what? I found a way to use ollama for code autocompletion! 🥳

As you can guess 🤓 you need ollama installed locally. If you don't have it, take a look at the previous post 🙏
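
If you are not sure whether ollama is already installed, a quick terminal check should tell you (assuming the ollama binary is on your PATH):

ollama --version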

So what do we need?

  • ollama installed and running locally
  • the codellama:7b-code model
  • the CodyAI extension for VS Code

Let's configure it together, step by step! 🧑‍💻

Download codellama:7b-code

The extension requires this model, you cannot choose any other 🫣, but it is pretty well optimised for the job ⚙️

ollama pull codellama:7b-code
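
Once the pull finishes, you can verify that the model is available locally:

ollama list

You should see codellama:7b-code in the output.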


Install CodyAI extension

You will find it in the Extensions view by searching for "cody ai"

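If you prefer the command line, you should also be able to install it with the code CLI. The extension ID below is what the marketplace listing uses at the time of writing, so double-check it if the install fails:

code --install-extension sourcegraph.cody-ai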

Configure CodyAI to use ollama as your companion

  • go to VS Code settings
  • type into the search bar: cody.autocomplete.advanced.provider
  • you should see the option
  • set it to "experimental-ollama" (see the settings.json snippet below)
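
If you prefer editing settings.json directly, the same option from the list above looks like this (only the setting mentioned in this post is shown):

{
  "cody.autocomplete.advanced.provider": "experimental-ollama"
}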


Make sure CodyAI autocompletion is enabled

  • click on the CodyAI icon in the bottom right bar
  • make sure the "Code autocomplete" option is enabled


Make sure you are running ollama

That seems obvious, but it's worth a reminder! 😅
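
A quick way to check is to hit the local API that ollama exposes (by default on port 11434); if the server is up, it answers with a short "Ollama is running" message:

curl http://localhost:11434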

Test it!

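Open any file, start typing, and you should get inline ghost-text suggestions coming from codellama. If nothing shows up, you can sanity-check the model itself outside VS Code, for example:

ollama run codellama:7b-code "def fibonacci(n):"

If that produces a completion, the problem is in the extension setup rather than in ollama.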

Summary

So now you can have ollama supporting you as a chat assistant and with code autocompletion as well! 🤩

Local, secure and free! 🆓

Have you tried ollama as a code companion? What do you think about it?
