Luca Liu

How to chat with Local LLM in Obsidian

Introduction

In Why You Should Try a Local LLM Model—and How to Get Started, I walked through setting up a local LLM with LM Studio.

In this article, I will show you how to chat with that local LLM from inside Obsidian.

Method

Obsidian’s Copilot plugin allows you to connect to a custom model, enabling you to use AI-generated insights directly within your markdown workspace. Here’s a step-by-step guide to setting it up.

Step 1: Install the Copilot Plugin in Obsidian

  1. Open Obsidian and go to Settings > Community Plugins.
  2. Enable Community Plugins if you haven’t already.
  3. Search for Copilot in the plugin library and click Install.
  4. Once installed, enable the plugin and access its settings via Settings > Copilot.

Step 2: Add Your Local LLM Model in Copilot

  1. In the Provider field, select lm-studio.
  2. Enter the Model Name.
  3. Click Verify Connection to ensure that Copilot can communicate with the model.

Once the connection is verified, click Add Model. Your custom model will now appear in the list of available models.
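If verification fails, it helps to first confirm that the LM Studio server is reachable outside Obsidian. Below is a minimal sketch using the openai Python client against LM Studio's OpenAI-compatible endpoint; it assumes the server is running on its default port 1234, and the model name in the chat call is a placeholder to replace with one of the ids printed by the loop.

```python
# Minimal sanity check for the LM Studio local server.
# Assumes the default endpoint http://localhost:1234/v1.
# pip install openai
from openai import OpenAI

# LM Studio does not check the API key; any placeholder string works.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# List the models the server exposes - these are the names to use
# in Copilot's Model Name field.
for model in client.models.list():
    print(model.id)

# Quick end-to-end test with a placeholder model name.
reply = client.chat.completions.create(
    model="your-model-name",  # replace with an id printed above
    messages=[{"role": "user", "content": "Say hello."}],
)
print(reply.choices[0].message.content)
```

If this script works but Copilot's verification still fails, the problem is usually the CORS setting covered in the next step.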

Step 3: Enable the Custom Model in General Settings

  1. In Copilot > General Settings, select the custom model you just added.
  2. Make sure to enable CORS in both LM Studio and Copilot; Obsidian's requests to the local server are cross-origin and will be blocked otherwise.
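You can also double-check the CORS setting without involving Obsidian: send a request with an Origin header and look for an Access-Control-Allow-Origin header in the response. This is a rough sketch assuming the default LM Studio endpoint; the app://obsidian.md origin is only an illustrative value.

```python
# Rough CORS check against the local LM Studio server.
# Assumes the default endpoint; the Origin value is illustrative.
import requests

resp = requests.get(
    "http://localhost:1234/v1/models",
    headers={"Origin": "app://obsidian.md"},
)
print(resp.status_code)
# With CORS enabled in LM Studio, this header should be present:
print(resp.headers.get("Access-Control-Allow-Origin"))
```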

Step 4: Open the Copilot Chat Window

Press Ctrl + Shift + P (Windows) or Cmd + Shift + P (Mac) and select Open Copilot Chat Window.

Now you can chat with your local LLM in Obsidian's sidebar.

Conclusion

That's it: with the Copilot plugin connected to LM Studio, you can chat with your local LLM without ever leaving Obsidian.


Explore more

Thank you for taking the time to explore data-related insights with me. I appreciate your engagement.

🚀 Connect with me on LinkedIn
