AWS Bedrock: Interfacing with the Claude LLM using Python

Introduction

Amazon Bedrock, a fully managed service by AWS, empowers developers to rapidly build and scale generative AI applications using foundation models (FMs). It offers a diverse selection of Large Language Models (LLMs) from leading providers such as Amazon, Anthropic, AI21 Labs, and Meta.

In this guide, I'll walk you through the simple steps to get started with Amazon Bedrock using the AWS Python SDK.


Prerequisites

Before we dive in, make sure you have the following:

  • AWS Credentials: Ensure your machine has properly configured AWS credentials with the necessary IAM policy:
{
    "Version": "2012-10-17",
    "Statement": [\n
        {
            "Sid": "BedrockFullAccess",
            "Effect": "Allow",
            "Action": ["bedrock:*"],
            "Resource": "*"
        }
    ]
}
  • Python 3.10 or later with pip installed.
  • Install the required pip packages with the commands below: boto3 for the AWS SDK and chainlit for a simple UI framework. More about Chainlit
pip install boto3
pip install chainlit 

We will be using the Claude 3 Haiku LLM for this application in the us-east-1 AWS region (refer to the documentation for the regions where Bedrock is available if you want to use some other region).
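
You can also check programmatically which Anthropic models Bedrock exposes in your region. Below is a minimal sketch using the Bedrock control-plane client (service name "bedrock", as opposed to the "bedrock-runtime" client used later for invocation), assuming your AWS credentials are already configured:

import boto3

# Control-plane client: lists models and manages access (not used for invocation)
bedrock = boto3.client(service_name="bedrock", region_name="us-east-1")

# List the Anthropic foundation models available in this region
response = bedrock.list_foundation_models(byProvider="Anthropic")
for model in response["modelSummaries"]:
    print(model["modelId"])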


Note: During setup, you'll be asked to provide company and use case information. Complete this step as required.

Please allow a few minutes for the model to become available after enabling it.



Interfacing with Python

Once the prerequisites are in place, create a Python file (e.g., main.py) with the following code:

Replace the question variable in the code with any sample question you want the LLM to answer.

import boto3
import json

def chat_with_bedrock(p_message):
    # Runtime client used to invoke models ("bedrock-runtime", not "bedrock")
    bedrock = boto3.client(service_name="bedrock-runtime", region_name="us-east-1")

    # Claude's Messages API expects a list of role/content message objects
    messages = [{
        "role": "user",
        "content": p_message
    }]

    body = json.dumps({
        "max_tokens": 256,
        "messages": messages,
        "anthropic_version": "bedrock-2023-05-31"
    })

    # Invoke Claude 3 Haiku; swap the modelId to use a different Claude model
    response = bedrock.invoke_model(body=body, modelId="anthropic.claude-3-haiku-20240307-v1:0")

    # The response body is a streaming object; read and parse the JSON payload
    response_body = json.loads(response.get("body").read())
    return response_body.get("content")

# replace the question with your query
question = "Who is US President in 2020"

response = chat_with_bedrock(question)

# "content" is a list of content blocks; print the text of the first one
print(response[0].get("text"))


Sample response

The President of the United States in 2020 is Donald Trump. He was elected in 2016 and his current term runs until January 20, 2021.
Some key facts about Donald Trump's presidency in 2020:

  • He is the 45th President of the United States.
  • He is a member of the Republican Party.
  • His vice president is Mike Pence.
  • Major events in 2020 included the COVID-19 pandemic, economic crisis, protests against police brutality, and the 2020 presidential election.
  • He ran for re-election against Democratic candidate Joe Biden in the 2020 election.
  • The 2020 election took place on November 3, 2020. Biden won both the popular vote and electoral college.
  • However, Trump did not concede the election and made unsubstantiated claims of widespread voter fraud before leaving office. So in summary, Donald Trump served as president throughout 2020, but lost his bid for re-election to Joe Biden, who was inaugurated as the 46th president on January 20, 2021.

This example demonstrates how to interface with AWS Bedrock using the AWS SDK for Python. You can further enhance this by implementing a prompt template or experimenting with different models available in Bedrock.
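
A prompt template can be as simple as a string you format around the user's question before passing it to chat_with_bedrock. Here is a minimal sketch (the template text itself is illustrative, not from the original post):

# Hypothetical prompt template wrapped around the user's question
PROMPT_TEMPLATE = (
    "You are a concise assistant. Answer the question below in two or three sentences.\n\n"
    "Question: {question}"
)

question = "Who is US President in 2020"
response = chat_with_bedrock(PROMPT_TEMPLATE.format(question=question))
print(response[0].get("text"))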

Tip: To use a different model in Bedrock, request access and update the modelId in the code accordingly.
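
Since we installed chainlit earlier, you can also put a simple chat UI in front of chat_with_bedrock. Below is a minimal sketch, assuming the function above lives in the same main.py file; start the app with chainlit run main.py:

import chainlit as cl

@cl.on_message
async def on_message(message: cl.Message):
    # Pass the user's chat message to Bedrock and send back the model's text
    response = chat_with_bedrock(message.content)
    await cl.Message(content=response[0].get("text")).send()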

Thank you!
