Jeongho Nam
[Typia] LLM Function Calling Application Composer in TypeScript

Preface

import { ILlmApplication } from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication = typia.llm.application<YourClassType>();

LLM function calling schemas from native TypeScript types.

Don't write LLM (Large Language Model) function calling schemas by hand. Just call the typia.llm.application<YourClassType>() function with your class (or interface) type. The LLM function calling application schema is generated automatically at compile time by analyzing YourClassType.

It is also possible to compose the LLM function calling application schema from an OpenAPI (Swagger) document. If you have a backend server with a Swagger document, you can turn it into an A.I. chatbot directly.

LLM Function Calling

OpenAI Logo

The LLM selects a proper function and fills in its arguments.

Nowadays, most LLM (Large Language Model) providers like "OpenAI" support a "function calling" feature. "LLM function calling" means that the LLM automatically selects a proper function and fills in its parameter values based on the conversation with the user (typically chat text).

https://platform.openai.com/docs/guides/function-calling
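The flow can be sketched in plain TypeScript. This is an illustrative sketch only: the response shape below mimics, but is not, any provider's actual API, and getWeather is a hypothetical function used for demonstration.

```typescript
// Illustrative sketch: the LLM replies with the name of the function
// to call plus JSON-encoded arguments, and the application dispatches it.
// (Hypothetical shapes; not OpenAI's actual tool-call response type.)
interface IFunctionCall {
  name: string;
  arguments: string; // JSON-encoded parameter values composed by the LLM
}

// Functions the application exposes to the LLM
const functions: Record<string, (input: any) => string> = {
  getWeather: (input: { city: string }) => `Sunny in ${input.city}`,
};

// Imagine the LLM produced this from "What's the weather in Seoul?"
const call: IFunctionCall = {
  name: "getWeather",
  arguments: JSON.stringify({ city: "Seoul" }),
};

// The application parses the arguments and executes the selected function
const result: string = functions[call.name](JSON.parse(call.arguments));
console.log(result); // "Sunny in Seoul"
```

The LLM only decides *which* function to call and *what* arguments to pass; the actual execution always happens on your side.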

Demonstration

Playground

💻 Playground Link: LLM function calling schemas

Try it on the web browser.

Here is a playground link where you can generate an LLM (Large Language Model) function calling application schema in the web browser with the typia.llm.application<App>() function.

This playground link was prepared to demonstrate that the typia.llm.application<App>() function works properly, but you are also welcome to use it to generate the LLM function calling schemas you actually need.

Description Comment

import { ILlmApplication } from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication = typia.llm.application<BbsArticleController>();

console.info(app);

interface BbsArticleController {
  /**
   * Create a new article.
   *
   * Writes a new article and archives it into the DB.
   *
   * @param input Information of the article to create
   * @returns Newly created article
   */
  create(input: IBbsArticle.ICreate): Promise<IBbsArticle>;

  /**
   * Update an article.
   *
   * Updates an article with new content.
   *
   * @param id Target article's {@link IBbsArticle.id}
   * @param input New content to update
   */
  update(
    id: string & tags.Format<"uuid">,
    input: IBbsArticle.IUpdate,
  ): Promise<void>;

  /**
   * Erase an article.
   *
   * Erases an article from the DB.
   *
   * @param id Target article's {@link IBbsArticle.id}
   */
  erase(id: string & tags.Format<"uuid">): Promise<void>;
}

/**
 * Article entity.
 *
 * `IBbsArticle` is an entity representing an article in the BBS (Bulletin Board System).
 */
interface IBbsArticle extends IBbsArticle.ICreate {
  /**
   * Primary Key.
   */
  id: string & tags.Format<"uuid">;

  /**
   * Creation time of the article.
   */
  created_at: string & tags.Format<"date-time">;

  /**
   * Last updated time of the article.
   */
  updated_at: string & tags.Format<"date-time">;
}
namespace IBbsArticle {
  /**
   * Information of the article to create.
   */
  export interface ICreate {
    /**
     * Title of the article.
     *
     * Representative title of the article.
     */
    title: string;

    /**
     * Content body.
     *
     * Content body of the article written in the markdown format.
     */
    body: string;

    /**
     * Thumbnail image URI.
     *
     * Thumbnail image URI which can represent the article.
     *
     * If configured as `null`, it means the article has no thumbnail image.
     */
    thumbnail:
      | null
      | (string & tags.Format<"uri"> & tags.ContentMediaType<"image/*">);
  }

  /**
   * Information of the article to update.
   *
   * Only the filled properties will be updated.
   */
  export type IUpdate = Partial<ICreate>;
}

Description comments are very important.

As you can see, the example code above writes detailed descriptions for every function and its parameter/return types. Such detailed descriptions are very important for teaching the purpose of each function to the LLM (Large Language Model); the LLM actually determines which function to call based on these descriptions.

Therefore, don't forget to write detailed descriptions. They are a crucial part of LLM function calling.

Parameters Separation

import { ILlmApplication, ILlmSchema, LlmTypeChecker } from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication = typia.llm.application<BbsArticleController>({
  separate: (schema: ILlmSchema) =>
    LlmTypeChecker.isString(schema) && schema.contentMediaType !== undefined,
});

console.log(JSON.stringify(app, null, 2));

interface BbsArticleController {
  /**
   * Create a new article.
   *
   * Writes a new article and archives it into the DB.
   *
   * @param input Information of the article to create
   * @returns Newly created article
   */
  create(input: IBbsArticle.ICreate): Promise<IBbsArticle>;

  /**
   * Update an article.
   *
   * Updates an article with new content.
   *
   * @param id Target article's {@link IBbsArticle.id}
   * @param input New content to update
   */
  update(
    id: string & tags.Format<"uuid">,
    input: IBbsArticle.IUpdate,
  ): Promise<void>;

  /**
   * Erase an article.
   *
   * Erases an article from the DB.
   *
   * @param id Target article's {@link IBbsArticle.id}
   */
  erase(id: string & tags.Format<"uuid">): Promise<void>;
}

Parameter values from both the LLM and human sides.

When composing parameter arguments through LLM (Large Language Model) function calling, there can be cases where some parameters (or nested properties) must be composed by a human rather than the LLM. File uploading, or sensitive information like a secret key (password), are representative examples.

In that case, you can configure the LLM function calling schemas to exclude such human-side parameters (or nested properties) with the ILlmApplication.options.separate property. You then have to merge the human-composed and LLM-composed parameters into one by calling HttpLlm.mergeParameters() before executing the LLM function call.
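To illustrate the idea, here is a conceptual sketch of what parameter merging does. This is NOT the real HttpLlm.mergeParameters() implementation from @samchon/openapi, just a minimal stand-in showing how the two sides combine:

```typescript
// Conceptual sketch of parameter merging (NOT the actual
// HttpLlm.mergeParameters() from @samchon/openapi).
// The LLM fills most properties; the human side supplies the
// separated ones, and the two are recursively combined.
const merge = <T extends object>(llm: Partial<T>, human: Partial<T>): T => {
  const output: any = { ...llm };
  for (const [key, value] of Object.entries(human))
    output[key] =
      value !== null &&
      typeof value === "object" &&
      typeof output[key] === "object"
        ? merge(output[key], value as object) // deep-merge nested objects
        : value; // human-side value wins for separated properties
  return output as T;
};

// LLM-composed values (human-side properties excluded by `separate`)
const llmSide = { title: "Hello, world!", body: "Some content" };
// Human-composed values (e.g. an uploaded file's URL)
const humanSide = { thumbnail: "https://example.com/thumbnail.jpg" };

const args = merge<{ title?: string; body?: string; thumbnail?: string }>(
  llmSide,
  humanSide,
);
console.log(args);
// { title: "Hello, world!", body: "Some content",
//   thumbnail: "https://example.com/thumbnail.jpg" }
```

The real library performs this merge against the function's schema so that the combined arguments always match the original, non-separated parameter types.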

LLM Function Call to Restful API

flowchart
  subgraph "OpenAPI Specification"
    v20("Swagger v2.0") --upgrades--> emended[["OpenAPI v3.1 (emended)"]]
    v30("OpenAPI v3.0") --upgrades--> emended
    v31("OpenAPI v3.1") --emends--> emended
  end
  subgraph "OpenAPI Generator"
    emended --normalizes--> migration[["Migration Schema"]]
    migration --"Artificial Intelligence"--> lfc{{"LLM Function Calling Application"}}
  end

You can also compose LLM function calling schemas from an OpenAPI (Swagger) document.

Until now, I've introduced how to convert a native TypeScript class (or interface) type into an LLM function calling application schema. However, someone may want something different: "Is it possible to do the same with an OpenAPI (Swagger) document?"

My answer is yes, but through another library of mine, @samchon/openapi. If you have a backend server speaking any HTTP protocol that can generate an OpenAPI (Swagger) document, you can develop an A.I. chatbot service directly with @samchon/openapi.

Let's make the backend server an A.I. chatbot.

import {
  HttpLlm,
  IHttpLlmApplication,
  IHttpLlmFunction,
  LlmTypeChecker,
  OpenApi,
  OpenApiV3,
  OpenApiV3_1,
  SwaggerV2,
} from "@samchon/openapi";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

const main = async (): Promise<void> => {
  // read swagger document and validate it
  const swagger:
    | SwaggerV2.IDocument
    | OpenApiV3.IDocument
    | OpenApiV3_1.IDocument = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  typia.assert(swagger); // recommended

  // convert to emended OpenAPI document,
  // and compose LLM function calling application
  const document: OpenApi.IDocument = OpenApi.convert(swagger);
  const application: IHttpLlmApplication = HttpLlm.application(document, {
    keyword: false,
    separate: (schema) =>
      LlmTypeChecker.isString(schema) && schema.contentMediaType !== undefined,
  });

  // Let's imagine that LLM has selected a function to call
  const func: IHttpLlmFunction | undefined = application.functions.find(
    // (f) => f.name === "llm_selected_function_name"
    (f) => f.path === "/bbs/articles/{id}" && f.method === "put",
  );
  if (func === undefined) throw new Error("No matched function exists.");

  // actual execution is by yourself
  const article = await HttpLlm.execute({
    connection: {
      host: "http://localhost:3000",
    },
    application,
    function: func,
    arguments: HttpLlm.mergeParameters({
      function: func,
      llm: [
        // LLM composed parameter values
        "general",
        v4(),
        {
          language: "en-US",
          format: "markdown",
        },
        {
          title: "Hello, world!",
          content: "Let's imagine that this argument is composed by LLM.",
        },
      ],
      human: [
        // Human composed parameter values
        { thumbnail: "https://example.com/thumbnail.jpg" },
      ],
    }),
  });
  console.log("article", article);
};
main().catch(console.error);
