Darren Fuller

Managing LLM prompts in DotNet with DotPrompt

We've been working on some internal tooling lately which makes use of AI to automate a number of tasks and to help reduce the time to production for customer use cases. While building this, one of the things that became increasingly annoying was that every time we needed to modify the system or user prompt we would have to rebuild, re-package, deploy, and then test again. While not massively time-consuming, it's a pain, and a few minutes each time adds up over a day.

So, easy solution: let's move the prompts to an external file so we can edit them without needing to rebuild. But...

What if we could make them template-driven as well? What if we could include configuration information, load them up-front and refer to them by name, and have thread-safe access by default (because this app is very Task oriented)?

And so we built DotPrompt, published it on GitHub, pushed the packages out to NuGet, and made it available under an MIT license.

I started piecing it together but wanted to see if someone else had done this already. It turns out there are a couple of efforts out there, like Firebase Genkit Dotprompt, but that's not for the DotNet ecosystem. There were a couple of others as well, but they were for Python.

Nobody ever thinks of the DotNet devs 😭

One of the things I didn't want to do was come up with a new format and parser for this. JSON would have been an obvious choice, but it's not exactly friendly for writing multi-line strings in. XML? No, screw that. So what about YAML? Well, it's structured, has great support for multi-line strings, and is pretty user-friendly to read and write.

So, what does a prompt file look like? Well, it looks like this.

name: Example
config:
  outputFormat: text
  temperature: 0.9
  maxTokens: 500
  input:
    parameters:
      topic: string
      style?: string
    default:
      topic: social media
prompts:
  system: |
    You are a helpful research assistant who will provide descriptive responses for a given topic and how it impacts society
  user: |
    Explain the impact of {{ topic }} on how we engage with technology as a society
    {% if style -%}
    Can you answer in the style of a {{ style }}
    {% endif -%}
fewShots:
  - user: What is Bluetooth
    response: Bluetooth is a short-range wireless technology standard that is used for exchanging data between fixed and mobile devices over short distances and building personal area networks.
  - user: How does machine learning differ from traditional programming?
    response: Machine learning allows algorithms to learn from data and improve over time without being explicitly programmed.
  - user: Can you provide an example of AI in everyday life?
    response: AI is used in virtual assistants like Siri and Alexa, which understand and respond to voice commands.

We have our configuration and a name we can refer to it by later. We've got parameters (with optional items denoted by a question mark ?), default values, the system prompt, the user prompt, and some few-shot prompts.

You might notice that the prompt has template instructions in there. For this we used the Fluid library, which is based on the Liquid template language from Shopify. It's got some great features and helps make the prompt generation pretty powerful.
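
If you've not come across Liquid before, here's a tiny standalone example of Fluid rendering a template. The template string and values here are just for illustration.

using Fluid;

// Parse a small Liquid template with the Fluid library
var parser = new FluidParser();

if (parser.TryParse("Explain the impact of {{ topic }}", out var template, out var error))
{
    // Provide a value for the placeholder and render the result
    var context = new TemplateContext();
    context.SetValue("topic", "bluetooth");

    Console.WriteLine(template.Render(context));
    // Output: Explain the impact of bluetooth
}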

The library expects the prompt files (all with the file extension of .prompt) to be in a folder called prompts in the current working directory. This means you can run your app from different directories and use different versions of the prompt files, useful if you have different tenants to target.
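
So a minimal layout, with the example file from above saved as example.prompt, would look something like this (MyApp here stands in for whatever your application is).

MyApp/
├── MyApp.dll
└── prompts/
    └── example.prompt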

So, how do you use it? Well, after installing it from NuGet you can either load a prompt file directly (see the repo) or use the prompt manager like this.

var promptManager = new PromptManager();
var promptFile = promptManager.GetPromptFile("example");

And now promptFile has all the information needed to generate the prompts. To generate the system prompt or user prompt you pass in a dictionary of values to fill in the template (or an empty dictionary, or null if there aren't any).

var systemPrompt = promptFile.GetSystemPrompt(null);
var userPrompt = promptFile.GetUserPrompt(new Dictionary<string, object>
{
    { "topic", "bluetooth" },
    { "style", "used car salesman" }
});

And we've now got our generated prompts πŸŽ‰
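
For the example prompt file above, those two calls produce something like this (the exact whitespace depends on the template).

System: You are a helpful research assistant who will provide descriptive responses for a given topic and how it impacts society

User: Explain the impact of bluetooth on how we engage with technology as a society
Can you answer in the style of a used car salesman

If style hadn't been supplied, the if block would simply drop out; and since topic has a default of social media in the file, it should fall back to that if omitted.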

We even built out some extension methods for OpenAI, as we're big users of Azure OpenAI, so you don't have to generate the prompts and wire them up in this way; you can swap it all for a method call.

var promptValues = new Dictionary<string, object>
{
    { "topic", "bluetooth" },
    { "style", "used car salesman" }
};

var completion = await client.CompleteChatAsync(
    promptFile.ToOpenAiChatMessages(promptValues),
    promptFile.ToOpenAiChatCompletionOptions()
);

So a full example would look something like this.

using System.ClientModel;
using Azure.AI.OpenAI;
using DotPrompt;
using DotPrompt.Extensions.OpenAi;

var openAiClient = new AzureOpenAIClient(new Uri("https://endpoint"), new ApiKeyCredential("abc123"));
var client = openAiClient.GetChatClient("model");

var promptManager = new PromptManager();
var promptFile = promptManager.GetPromptFile("example");

var promptValues = new Dictionary<string, object>
{
    { "topic", "bluetooth" },
    { "style", "used car salesman" }
};

var completion = await client.CompleteChatAsync(
    promptFile.ToOpenAiChatMessages(promptValues),
    promptFile.ToOpenAiChatCompletionOptions()
);

var response = completion.Value;
Console.WriteLine(response.Content[0].Text);

The call to ToOpenAiChatMessages also includes the few-shot prompts if they're present in the prompt file.
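
So, for the example file above, the message list handed to the client would presumably be shaped like this (the exact ordering is my assumption, but system first, then the few-shot pairs, then the templated user prompt is the conventional layout).

[system]    You are a helpful research assistant who will provide descriptive responses...
[user]      What is Bluetooth
[assistant] Bluetooth is a short-range wireless technology standard...
[user]      How does machine learning differ from traditional programming?
[assistant] Machine learning allows algorithms to learn from data...
[user]      Can you provide an example of AI in everyday life?
[assistant] AI is used in virtual assistants like Siri and Alexa...
[user]      Explain the impact of bluetooth on how we engage with technology as a society...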

And that's it: the call to the LLM is done, we can change the parameter values on each call, and we can change the prompts without re-compiling. We integrated this into our internal app and it made the whole process of tweaking so much easier. And we'll continue to dog-food the library.

We did build in a few more things as well, because why not. So it also has:

  • The ability to parse a prompt file from a stream
  • Parameter value validation at runtime
  • Interfaces to allow the file store and the prompt manager to be mocked out or used in dependency injection scenarios
  • Because the file store has an interface, the prompt manager takes an instance of that, so you can build your own stores in case you want to hold them in something like a database (see the repo for an example, and the sketch below)
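
As a rough sketch of that last point, a custom store might look something like this. The interface name and members here are assumptions for illustration only; check the repo for the actual definitions.

using System.Collections.Generic;
using DotPrompt;

// Hypothetical sketch: IPromptStore and its members are assumed here
// for illustration; see the DotPrompt repo for the real interface.
var promptManager = new PromptManager(new DatabasePromptStore());

public class DatabasePromptStore : IPromptStore
{
    // Imagined member: load prompt files from a database table
    // instead of the local "prompts" folder
    public IEnumerable<PromptFile> Load()
    {
        // Query your database here and map each row to a PromptFile
        yield break;
    }
}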

We should probably come up with a logo for it as well; right now it's the boring NuGet default.

Is there something you'd like to see it do? If so, feel free to raise it over on GitHub.
