In this blog post, we will explore how to implement CRUD (Create, Read, Update, Delete) operations through natural language using the Microsoft.Extensions.AI library in a .NET Web API application. We will let users interact with a light management system through plain-language queries, which the chat model translates into the corresponding CRUD calls.
Create a .NET Web API Application
First, let's create a new Web API project using the dotnet CLI:
dotnet new webapi -o lightsmeai
This command generates a basic Web API project named "lightsmeai".
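Before adding packages, move into the new project folder so the commands below run against it:
cd lightsmeai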
Add Required Packages
Next, we need to add the necessary packages to our project. These packages include Azure.AI.OpenAI, Azure.Identity, DotNetEnv, Microsoft.AspNetCore.OpenApi, Microsoft.Extensions.AI, and more. Run the following commands to install the required packages:
dotnet add package Azure.AI.OpenAI --version 2.1.0-beta.2
dotnet add package Azure.Identity --version 1.13.1
dotnet add package DotNetEnv --version 3.1.1
dotnet add package Microsoft.AspNetCore.OpenApi --version 8.0.1
dotnet add package Microsoft.Extensions.AI --version 9.0.0-preview.9.24556.5
dotnet add package Microsoft.Extensions.AI.AzureAIInference --version 9.0.0-preview.9.24556.5
dotnet add package Microsoft.Extensions.AI.OpenAI --version 9.0.0-preview.9.24556.5
dotnet add package Swashbuckle.AspNetCore --version 6.4.0
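The Program.cs shown in the next step reads a GITHUB_KEY value from a .env file via DotNetEnv. Assuming you authenticate against GitHub Models with a personal access token (the value below is only a placeholder), the .env file in the project root could look like this:
GITHUB_KEY=github_pat_xxxxxxxxxxxxxxxx
Keep the .env file out of source control.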
Program.cs
In the Program.cs file, we set up the necessary configuration and services for our application. Here's the code snippet:
using Azure;
using Azure.AI.Inference;
using Azure.AI.OpenAI;
using DotNetEnv;
using Microsoft.Extensions.AI;
// Get keys from configuration
Env.Load(".env");
string githubKey = Env.GetString("GITHUB_KEY");
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
// Add the chat client
IChatClient innerChatClient = new ChatCompletionsClient(
endpoint: new Uri("https://models.inference.ai.azure.com"),
new AzureKeyCredential(githubKey))
.AsChatClient("gpt-4o-mini");
builder.Services.AddChatClient(chatClientBuilder => chatClientBuilder
.UseFunctionInvocation()
// .UseLogging()
.Use(innerChatClient));
// Register embedding generator
builder.Services.AddSingleton<IEmbeddingGenerator<string, Embedding<float>>>(sp =>
new AzureOpenAIClient(new Uri("https://models.inference.ai.azure.com"),
new AzureKeyCredential(githubKey))
.AsEmbeddingGenerator(modelId: "text-embedding-3-large"));
builder.Services.AddLogging(loggingBuilder =>
loggingBuilder.AddConsole().SetMinimumLevel(LogLevel.Trace));
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
app.UseStaticFiles(); // Enable serving static files
app.UseRouting(); // Must come before UseEndpoints
app.UseAuthorization();
app.MapControllers();
// Serve index.html as the default page
app.MapFallbackToFile("index.html");
app.Run();
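At this point the project should already build and run. Start it with the command below, then open /swagger on the URL that dotnet run prints for your machine (the port varies per project):
dotnet run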
Add ChatController
Let's create a ChatController to handle natural language queries and perform CRUD operations. It receives the IChatClient, the embedding generator, and a ChatOptions instance through dependency injection; the ChatOptions registration is added later, when we expose the CRUD operations as tools. Here's the code for the ChatController:
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.AI;
namespace lightsmeai.Controllers
{
[ApiController]
[Route("[controller]")]
public class ChatController : ControllerBase
{
private readonly IChatClient _chatClient;
private readonly IEmbeddingGenerator<string, Embedding<float>> _embeddingGenerator;
private readonly ChatOptions _chatOptions;
public ChatController(
IChatClient chatClient,
IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator,
ChatOptions chatOptions
)
{
_chatClient = chatClient;
_embeddingGenerator = embeddingGenerator;
_chatOptions = chatOptions;
}
[HttpPost("chat")]
public async Task<ActionResult<IEnumerable<string>>> Chat(string userMessage)
{
var messages = new List<ChatMessage>
{
new(Microsoft.Extensions.AI.ChatRole.System, """
Hey there, I'm Lumina, your friendly lighting assistant, and I can answer any question!
I can help you with all your lighting needs.
You can ask me to turn on a light, get the status of a light,
turn off all the lights, add a new light, or delete a light.
To update a light, build an object like the one below;
sometimes the user will pass all the key values, other times only one or two.
{ "id": 6, "name": "Chandelier", "Switched": false }
Just let me know what you need and I'll do my best to help!
"""),
new(Microsoft.Extensions.AI.ChatRole.User, userMessage)
};
var response = await _chatClient.CompleteAsync(messages, _chatOptions);
return Ok(response.Message.Text);
}
}
}
Remove WeatherForecast Related Code
We will remove the WeatherForecast-related code from the Program.cs file, as it is not relevant to our CRUD operations.
Add Microsoft.EntityFrameworkCore.InMemory
To manage our light data, we will use an in-memory database provided by Microsoft.EntityFrameworkCore.InMemory. Install the package using the following command:
dotnet add package Microsoft.EntityFrameworkCore.InMemory
Add Model and its DbContext
Let's define a Light model to represent our light entities and create a LightContext to manage the in-memory database. Here's the code:
using System.Text.Json.Serialization;
namespace lightsmeai.Models
{
public class Light
{
[JsonPropertyName("id")]
public int Id { get; set; }
[JsonPropertyName("name")]
public string? Name { get; set; }
[JsonPropertyName("Switched")]
public bool? Switched { get; set; }
}
}
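The post doesn't list the LightContext itself, so here is a minimal sketch; the namespace and the Lights DbSet name are assumptions, adjust them to match your project:
using Microsoft.EntityFrameworkCore;
namespace lightsmeai.Models
{
    // Minimal EF Core context backing the in-memory light store.
    public class LightContext : DbContext
    {
        public LightContext(DbContextOptions<LightContext> options) : base(options)
        {
        }
        // The set of lights managed by the CRUD endpoints.
        public DbSet<Light> Lights => Set<Light>();
    }
}
The LightController below creates this context over the in-memory "LightList" database and exposes the CRUD endpoints (only the skeleton is shown here):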
using lightsmeai.Models;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
namespace lightsmeai.Controllers
{
[Route("api/[controller]")]
[ApiController]
public class LightController : ControllerBase
{
// Initialize the context within the constructor
private readonly LightContext _context;
public LightController()
{
var options = new DbContextOptionsBuilder<LightContext>()
.UseInMemoryDatabase("LightList")
.Options;
_context = new LightContext(options);
}
// CRUD operations implementation...
}
}
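The CRUD method bodies are elided above, but the upcoming Program.cs wraps GetLights, GetLight, AddLight, UpdateLight, and DeleteLight as AI tools, so the controller needs public methods with those names. A possible implementation, my own sketch rather than the author's exact code, could look like this inside LightController:
[HttpGet]
public async Task<IEnumerable<Light>> GetLights()
{
    // Return every light in the in-memory store.
    return await _context.Lights.ToListAsync();
}

[HttpGet("{id}")]
public async Task<Light?> GetLight(int id)
{
    // Look up a single light by its key.
    return await _context.Lights.FindAsync(id);
}

[HttpPost]
public async Task<Light> AddLight(Light light)
{
    _context.Lights.Add(light);
    await _context.SaveChangesAsync();
    return light;
}

[HttpPut]
public async Task<Light?> UpdateLight(Light light)
{
    var existing = await _context.Lights.FindAsync(light.Id);
    if (existing is null) return null;
    // The system prompt allows partial updates, so only copy fields that were supplied.
    if (light.Name is not null) existing.Name = light.Name;
    if (light.Switched is not null) existing.Switched = light.Switched;
    await _context.SaveChangesAsync();
    return existing;
}

[HttpDelete("{id}")]
public async Task<string> DeleteLight(int id)
{
    var existing = await _context.Lights.FindAsync(id);
    if (existing is null) return $"Light {id} was not found.";
    _context.Lights.Remove(existing);
    await _context.SaveChangesAsync();
    return $"Light {id} was deleted.";
}
Returning plain types rather than ActionResult keeps these methods easy to wrap with AIFunctionFactory.Create in the next step.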
Check the LightController REST API Endpoints in Swagger
After setting up the LightController, we can check its REST API endpoints in Swagger and interact with our light management system.
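Assuming the AddLight sketch above, you can also seed a light from the command line (host and port are placeholders for a local run):
curl -X POST "https://localhost:5001/api/Light" -H "Content-Type: application/json" -d '{ "id": 1, "name": "Kitchen", "Switched": true }'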
Add Functions in Program.cs
In the Program.cs file, we will add functions that expose the CRUD operations as tools for the chat client, and register the resulting ChatOptions so the ChatController can pick it up. Here's the updated code:
using Azure;
using Azure.AI.Inference;
using Azure.AI.OpenAI;
using DotNetEnv;
using lightsmeai.Controllers;
using Microsoft.Extensions.AI;
// Get keys from configuration
Env.Load(".env");
string githubKey = Env.GetString("GITHUB_KEY");
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
// Add the chat client
IChatClient innerChatClient = new ChatCompletionsClient(
endpoint: new Uri("https://models.inference.ai.azure.com"),
new AzureKeyCredential(githubKey))
.AsChatClient("gpt-4o-mini");
builder.Services.AddChatClient(chatClientBuilder => chatClientBuilder
.UseFunctionInvocation()
.UseLogging()
.Use(innerChatClient));
// Register embedding generator
builder.Services.AddSingleton<IEmbeddingGenerator<string, Embedding<float>>>(sp =>
new AzureOpenAIClient(new Uri("https://models.inference.ai.azure.com"),
new AzureKeyCredential(githubKey))
.AsEmbeddingGenerator(modelId: "text-embedding-3-large"));
builder.Services.AddLogging(loggingBuilder =>
loggingBuilder.AddConsole().SetMinimumLevel(LogLevel.Trace));
var light = new LightController();
var getAllLightsTool = AIFunctionFactory.Create(light.GetLights);
var getLightTool = AIFunctionFactory.Create(light.GetLight);
var createLightTool = AIFunctionFactory.Create(light.AddLight);
var updateLightTool = AIFunctionFactory.Create(light.UpdateLight);
var deleteLightTool = AIFunctionFactory.Create(light.DeleteLight);
var chatOptions = new ChatOptions
{
Tools = new[]
{
getAllLightsTool,
getLightTool,
createLightTool,
updateLightTool,
deleteLightTool
}
};
builder.Services.AddSingleton(light);
builder.Services.AddSingleton(chatOptions);
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
app.UseStaticFiles(); // Enable serving static files
app.UseRouting(); // Must come before UseEndpoints
app.UseAuthorization();
app.MapControllers();
// Serve index.html as the default page
app.MapFallbackToFile("index.html");
app.Run();
Few Changes in ChatController
We will make a few adjustments to the ChatController so it utilizes the tools we exposed in the previous step. Here's the updated code:
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.AI;
namespace lightsmeai.Controllers
{
[ApiController]
[Route("[controller]")]
public class ChatController : ControllerBase
{
private readonly IChatClient _chatClient;
private readonly IEmbeddingGenerator<string, Embedding<float>> _embeddingGenerator;
private readonly ChatOptions _chatOptions;
public ChatController(
IChatClient chatClient,
IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator,
ChatOptions chatOptions
)
{
_chatClient = chatClient;
_embeddingGenerator = embeddingGenerator;
_chatOptions = chatOptions;
}
[HttpPost("chat")]
public async Task<ActionResult<IEnumerable<string>>> Chat(string userMessage)
{
var messages = new List<ChatMessage>
{
new(Microsoft.Extensions.AI.ChatRole.System, """
Hey there, I'm Lumina, your friendly lighting assistant, and I can answer any question!
I can help you with all your lighting needs.
You can ask me to turn on a light, get the status of a light,
turn off all the lights, add a new light, or delete a light.
To update a light, build an object like the one below;
sometimes the user will pass all the key values, other times only one or two.
{ "id": 6, "name": "Chandelier", "Switched": false }
Just let me know what you need and I'll do my best to help!
"""),
new(Microsoft.Extensions.AI.ChatRole.User, userMessage)
};
var response = await _chatClient.CompleteAsync(messages, _chatOptions);
return Ok(response.Message.Text);
}
}
}
Check the ChatController REST API Endpoints in Swagger
Finally, we can check the REST API endpoints in Swagger to interact with our chat controller and perform CRUD operations using natural language queries.
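For example, a request like the one below asks Lumina to switch a light off, and the model calls the matching LightController tool behind the scenes. The host and port are assumptions for a local run, and userMessage is bound from the query string:
curl -X POST "https://localhost:5001/Chat/chat?userMessage=Turn%20off%20the%20Chandelier"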
With this setup, users can interact with our light management system through natural language queries, and the application will respond with appropriate actions based on the user's input. The Microsoft.Extensions.AI library and the power of NLP enable us to create a more intuitive and user-friendly interface for managing lights.