AISuite: Simplifying GenAI integration across multiple LLM providers

Vishnu Sivan

Generative AI (Gen AI) is reshaping industries with its potential for creativity, problem-solving, and automation. However, developers often face significant challenges when integrating large language models (LLMs) from different providers due to fragmented APIs and configurations. This lack of interoperability complicates workflows, extends development timelines, and hampers the creation of effective Gen AI applications.

To address this, Andrew Ng’s team has introduced AISuite, an open-source Python library that streamlines the integration of LLMs across providers like OpenAI, Anthropic, and Ollama. AISuite enables developers to switch between models with a simple “provider:model” string (e.g., openai:gpt-4o or anthropic:claude-3-5), eliminating the need for extensive code rewrites. By providing a unified interface, AISuite significantly reduces complexity, accelerates development, and opens new possibilities for building versatile Gen AI applications.

In this article, we will explore how AISuite works, its practical applications, and its effectiveness in addressing the challenges of working with diverse LLMs.

Getting Started

Table of contents

  • What is AISuite
  • Why is AISuite important
  • Experimenting with AISuite
  • Creating a Chat Completion
  • Creating a generic function for querying
  • Interacting with multiple APIs

What is AISuite

AISuite is an open-source Python library developed by Andrew Ng’s team to simplify the integration and management of large language models (LLMs) from multiple providers. It abstracts the complexities of working with diverse APIs, configurations, and data formats, providing developers with a unified framework to streamline their workflows.

Key Features of AISuite:

  • Straightforward Interface: AISuite offers a simple and consistent interface for managing various LLMs. Developers can integrate models into their applications with just a few lines of code, significantly lowering the barriers to entry for Gen AI projects.
  • Unified Framework: By abstracting the differences between multiple APIs, AISuite handles different types of requests and responses seamlessly. This reduces development overhead and accelerates prototyping and deployment.
  • Easy Model Switching: With AISuite, switching between models is as easy as changing a single string in the code. For example, developers can specify a “provider:model” combination like openai:gpt-4o or anthropic:claude-3-5 without rewriting significant parts of their application (see the short sketch after this list).
  • Extensibility: AISuite is designed to adapt to the evolving Gen AI landscape. Developers can add new models and providers as they become available, ensuring applications remain up-to-date with the latest AI capabilities.
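
As a quick illustration of model switching, here is a minimal sketch, assuming the relevant API keys are already set as environment variables and that the listed models are available to your account:

import aisuite as ai

client = ai.Client()
messages = [{"role": "user", "content": "Summarize AISuite in one sentence."}]

# The same call works for every provider; only the "provider:model" string changes
for model in ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20241022"]:
    response = client.chat.completions.create(model=model, messages=messages)
    print(model, "->", response.choices[0].message.content)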

Why is AISuite Important?

AISuite addresses a critical pain point in the Gen AI ecosystem: the lack of interoperability between LLMs from different providers. By providing a unified interface, it simplifies the development process, saving time and reducing costs. This flexibility allows teams to optimize performance by selecting the best model for specific tasks.

Early community feedback highlights how much integration time AISuite can save in multi-model applications, improving developer efficiency and productivity. As the Gen AI ecosystem grows, AISuite lowers the barriers to experimenting with, building, and scaling AI-powered solutions.

Experimenting with AISuite

Let's get started with AISuite by installing the necessary dependencies.

Installing dependencies

  • Create and activate a virtual environment by executing the following commands.
python -m venv venv
source venv/bin/activate   # for ubuntu
venv\Scripts\activate      # for windows
  • Install the aisuite, openai and python-dotenv libraries using pip.
pip install "aisuite[all]" openai python-dotenv

[screenshot: installation output]

Setting up environment and credentials

Create a file named .env. This file will store your environment variables, including the OpenAI and Groq API keys.

  • Open the .env file and add the following lines to specify your API keys:
OPENAI_API_KEY=sk-proj-7XyPjkdaG_gDl0_...
GROQ_API_KEY=gsk_8NIgj24k2P0J5RwrwoOBW...
  • Load the API keys into the environment variables.
import os
from getpass import getpass
from dotenv import load_dotenv

load_dotenv()
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')
os.environ['GROQ_API_KEY'] = os.getenv('GROQ_API_KEY')
# Prompt for the Anthropic key if it is not stored in the .env file
os.environ['ANTHROPIC_API_KEY'] = getpass('Enter your ANTHROPIC API key: ')

Initialize the AISuite Client

Create an instance of the AISuite client, enabling standardized interaction with multiple LLMs.

import aisuite as ai

client = ai.Client()

Defining the prompt

The prompt syntax closely resembles OpenAI’s structure, incorporating roles and content.

messages = [
   {"role": "system", "content": "You are a helpful assistant."},
   {"role": "user", "content": "Tell a joke in 1 line."}
]

Querying the model

You can query the model using AISuite as follows.

# openai model
response = client.chat.completions.create(model="openai:gpt-4o", messages=messages, temperature=0.75)
# ollama model
response = client.chat.completions.create(model="ollama:llama3.1:8b", messages=messages, temperature=0.75)
# anthropic model
response = client.chat.completions.create(model="anthropic:claude-3-5-sonnet-20241022", messages=messages, temperature=0.75)
# groq model
response = client.chat.completions.create(model="groq:llama-3.2-3b-preview", messages=messages, temperature=0.75)

# each call above overwrites response, so this prints the last (groq) reply
print(response.choices[0].message.content)
  • model="openai:gpt-4o": Specifies type and version of the model.
  • messages=messages: Sends the previously defined prompt to the model.
  • temperature=0.75: Adjusts the randomness of the response. Higher values encourage creative outputs, while lower values produce more deterministic results (see the sketch after this list).
  • response.choices[0].message.content: Retrieves the text content from the model's response.
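
To see how the temperature parameter changes the output, you can send the same prompt at different values. A minimal sketch, reusing the client and messages defined above (outputs will vary from run to run):

# Same prompt at a low and a high temperature; lower values give more deterministic replies
for temp in [0.0, 0.75]:
    response = client.chat.completions.create(model="openai:gpt-4o", messages=messages, temperature=temp)
    print(f"temperature={temp}: {response.choices[0].message.content}")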

Creating a Chat Completion

Let's create a chat completion example using the OpenAI model. Save the following code as app.py.

import os
from dotenv import load_dotenv
load_dotenv()
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')

import aisuite as ai

client = ai.Client()

provider = "openai"
model_id = "gpt-4o"

messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Provide an overview of the latest trends in AI"},
]

response = client.chat.completions.create(
    model = f"{provider}:{model_id}",
    messages = messages,
)

print(response.choices[0].message.content)
  • Run the app using the following command.
python app.py

You will get output similar to the following.

[screenshot: model output]

Creating a generic function for querying

Instead of writing separate code for calling different models, let’s create a generic function to eliminate code repetition and improve efficiency.

def ask(message, sys_message="You are a helpful assistant", model="openai:gpt-4o"):
    client = ai.Client()
    messages = [
        {"role": "system", "content": sys_message},
        {"role": "user", "content": message}
    ]
    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content

print(ask("Provide an overview of the latest trends in AI"))

The ask function is a reusable utility designed for sending queries to an AI model. It accepts the following parameters:

  • message: The user's query or prompt.
  • sys_message (optional): A system-level instruction to guide the model's behavior.
  • model: Specifies the AI model to be used.

The function processes the input parameters, sends them to the specified model, and returns the AI’s response, making it a versatile tool for interacting with various models.

Below is the complete code for interacting with the OpenAI model using the generic ask function.

import os
from dotenv import load_dotenv
load_dotenv()
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')

import aisuite as ai

def ask(message, sys_message="You are a helpful assistant", model="openai:gpt-4o"):
    client = ai.Client()
    messages = [
        {"role": "system", "content": sys_message},
        {"role": "user", "content": message}
    ]
    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content

print(ask("Provide an overview of the latest trends in AI"))

Running the code will produce the following output.

[screenshot: model output]

Interacting with multiple APIs

Let’s explore interacting with multiple models using AISuite through the following code.

import os
from dotenv import load_dotenv
load_dotenv()
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')
os.environ['GROQ_API_KEY'] = os.getenv('GROQ_API_KEY')

import aisuite as ai

def ask(message, sys_message="You are a helpful assistant", model="openai:gpt-4o"):
    client = ai.Client()
    messages = [
        {"role": "system", "content": sys_message},
        {"role": "user", "content": message}
    ]
    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content

print(ask("Who is your creator?"))
print(ask('Who is your creator?', model='ollama:qwen2:1.5b'))
print(ask('Who is your creator?', model='groq:llama-3.1-8b-instant'))
print(ask('Who is your creator?', model='anthropic:claude-3-5-sonnet-20241022'))

There may still be challenges when interacting with some providers, such as Anthropic or Groq; hopefully the AISuite team will continue to iron out these issues so that integration stays seamless across providers.
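
Until these rough edges are smoothed out, it helps to wrap provider calls defensively so one failing provider does not stop the whole script. A minimal sketch, reusing the ask function defined above (the model names are only examples):

models = [
    "openai:gpt-4o",
    "groq:llama-3.1-8b-instant",
    "anthropic:claude-3-5-sonnet-20241022",
]

for model in models:
    try:
        print(model, "->", ask("Who is your creator?", model=model))
    except Exception as e:
        # Skip providers that are misconfigured or temporarily unavailable
        print(f"{model} failed: {e}")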

AISuite is a powerful tool for navigating the landscape of large language models. It enables users to leverage the strengths of multiple AI providers while streamlining development and encouraging innovation. With its open-source foundation and intuitive design, AISuite stands out as a cornerstone for modern AI application development.

Thanks for reading this article!

Thanks Gowri M Bhatt for reviewing the content.

If you enjoyed this article, please click on the heart button ♥ and share to help others find it!

The full source code for this tutorial can be found here:

GitHub - codemaker2015/aisuite-examples : github.com

Resources

GitHub - andrewyng/aisuite: Simple, unified interface to multiple Generative AI providers : github.com
