A fair warning before we start: what I am demonstrating in this article and its associated video is, for all practical purposes, the single largest SQL injection attack surface you've ever seen. If you do this, take every precaution to make sure you are the only person in the world with access to it. With that disclaimer out of the way, let's get started.
A GPT is a ChatGPT "app" that allows you to connect it to an API. An API can do anything you wish, including accepting raw SQL, executing it, and returning the result as a raw recordset.
In the video below I've created a GPT that does just that. Combining my API with a bit of prompt engineering, I give my GPT an instruction that lets it transpile natural English into SQL. The result is a GPT that takes natural language, generates SQL, and executes it dynamically against my API endpoint, letting me query my database in plain English.
My API was created in a Magic Cloudlet using AINIRO.IO's Magic Cloud, which is my startup's product - however, fundamentally you can use anything you wish here that lets you create an API endpoint accepting SQL as its sole input query argument. I've shamelessly included a link to our services below.
The entire Hyperlambda code required to reproduce what I'm doing can be found below. Create a Hyperlambda file called, for instance, "execute-sql.get.hl" inside a module of your choosing, and put the following code in it.
As my GPT's instruction I used the following:
You are an SQL expert, and I will give you instructions that you will transform into an SQL statement that you send to the execute-sql endpoint action. Do not send the SQL before showing me and I confirm with "execute". Once you've executed the SQL, show me the result of the execution.
And voila, I've got a "natural language NoSQL GPT" allowing me to write English to query my database. Pretty cool if you ask me 😎
Yet again, you probably should not do this just because you can, because in the end you've just created the single largest SQL injection attack hole in the world. But you can do it if you wish ...
I've purposely not shown you how to create a module and connect it to a GPT using an OpenAPI specification, simply because if you're not skilled enough to do that by yourself, you're not experienced enough to be doing this in the first place - and the temptation for no-coders would simply be too great.