Durvesh Danve

Use any LLM with Just 8 Lines of Code 🚀

Ever wondered how easy it could be to harness the power of cutting-edge AI models in your projects?

With just 8 lines of Python code, you can start using a powerful Large Language Model (LLM) without diving into the complexities of training one from scratch.

Let's see how!

Tools we'll be using:

1. Hugging Face pretrained model (in this case, Falcon)
2. Python
3. LangChain
4. Google Colab

First, open Google Colab and create a new notebook.

Let's start coding:

Step 1:
Install the necessary libraries:

!pip install langchain huggingface_hub langchain_community

Step 2:
Set up your Hugging Face API token as an environment variable:

import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"

To get your token:

  1. Visit Hugging Face and sign in or create an account.
  2. Navigate to the settings page and select the Access Tokens tab.
  3. Create a token and replace "YOUR_TOKEN" with your actual token.

(Screenshot: the Hugging Face Access Tokens settings page)
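
If you'd rather not paste the token directly into the notebook, a small optional sketch (using Python's built-in getpass) is to read it interactively instead:

import os
from getpass import getpass

# Prompt for the token at runtime so it never appears in the notebook itself
os.environ["HUGGINGFACEHUB_API_TOKEN"] = getpass("Enter your Hugging Face token: ")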

Step 3:
Import HuggingFaceHub from langchain:

from langchain import HuggingFaceHub

Initialize your Large Language Model (LLM):

llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature":0.6})

I'm using the tiiuae/falcon-7b-instruct model here, but there are plenty of other models to explore on the Hugging Face Hub.
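
Swapping models is just a matter of changing the repo_id. As a purely illustrative sketch (the model ID and extra kwargs below are examples, not requirements), you could point the same wrapper at another hosted model:

# Same pattern, different hosted model; model_kwargs are passed to the Inference API
other_llm = HuggingFaceHub(
    repo_id="google/flan-t5-large",
    model_kwargs={"temperature": 0.6, "max_new_tokens": 200},
)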

Let's test the model:

prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)

which produces something like:

def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1)+fibonacci(n - 2)
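
Interestingly, even though the prompt asks for efficiency, the model returned the naive recursion, which runs in exponential time. For comparison, here is my own quick sketch (not model output) of an iterative O(n) version:

def fibonacci(n):
    # Iterative version: O(n) time, O(1) extra space
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a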

And just like that, with only 8 lines of code, we're generating responses from a powerful LLM! 🎉💻

Complete Code

# Install necessary libraries
!pip install langchain huggingface_hub langchain_community

import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"

from langchain import HuggingFaceHub

# Initialize the model
llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature":0.6})

# Use the model to generate a response
prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)
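
A quick note on library versions: newer LangChain releases deprecate HuggingFaceHub and calling the LLM directly like llm(prompt) in favour of the langchain_huggingface package and .invoke(). If you hit deprecation warnings, something along these lines should work (treat the exact class and parameter names as an assumption that depends on your installed versions):

# Sketch for newer LangChain versions (assumes `pip install langchain-huggingface`)
from langchain_huggingface import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(repo_id="tiiuae/falcon-7b-instruct", temperature=0.6)
print(llm.invoke(prompt))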

Top comments (9)

Red Ochsenbein (he/him)

"Use a LLM..." would be more appropriate. You're not building a model

Durvesh Danve

Yes, you're right. Thanks for the suggestion!

Alana E

Building?

You're using an API??

Misleading or just a mistake of wording???

Durvesh Danve

Thanks for sharing your thoughts. The intention behind the title was to highlight how easy it can be to use an LLM with minimal code.

ANNA LAPUSHNER

I love that you are coding! I love that you are publishing your process! Thank you for inviting me into the world of this powerful sequence, the awesome Fibonacci...

Durvesh Danve

Hehe Thanks!

ANNA LAPUSHNER

Take care my friend!

Tanmay Borde

Awesome!

Durvesh Danve

Thanks!