## DEV Community

Durvesh Danve


# Use Any LLM with Just 8 Lines of Code 🚀

Ever wondered how easy it could be to harness the power of cutting-edge AI models in your projects?

With just 8 lines of Python code, you can start using a powerful Large Language Model (LLM) without diving into the complexities of training one from scratch.

Letβs see how!

Tools we'll be using:

1. Hugging Face pretrained model (in this case, Falcon)
2. Python
3. LangChain

First, open Google Colab and create a new notebook.

Let's start coding:

Step 1:
Install the necessary libraries:

```
!pip install langchain huggingface_hub langchain_community
```

Step 2:
Set up your Hugging Face API token as an environment variable:

1. Sign in to your Hugging Face account.
2. Navigate to the settings page and select the Access Tokens tab.
3. Create a token and replace "YOUR_TOKEN" with your actual token.

```
import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"
```
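A common slip-up is forgetting to set the token before initializing the model, which surfaces later as a confusing authentication error. A small sketch (the `token_is_set` helper is hypothetical, not part of any library) that fails fast instead:

```python
import os

def token_is_set(env=os.environ):
    """Return True if the Hugging Face token env var is present and non-empty."""
    return bool(env.get("HUGGINGFACEHUB_API_TOKEN"))

# Warn early rather than failing deep inside the first model call.
if not token_is_set():
    print("Warning: set HUGGINGFACEHUB_API_TOKEN before initializing the model")
```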

Step 3:
Import HuggingFaceHub from langchain:

```
from langchain import HuggingFaceHub
```

Initialize your Large Language Model (LLM):

```
llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature": 0.6})
```

Iβm using the tiiuae/falcon-7b-instruct model here, but there are plenty of models available. You can explore them here.

Letβs test the model:

```
prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)
```

This produces:

```
def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)
```
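Worth noting: although the prompt asked for minimal time complexity, the model's naive recursion runs in exponential time because it recomputes the same subproblems over and over. For comparison, here is an iterative O(n) version (my own sketch, not model output):

```python
def fibonacci_iter(n):
    # O(n) time, O(1) space: advance a pair of values instead of
    # recomputing overlapping recursive subtrees.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fibonacci_iter(i) for i in range(10)])  # → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

This is also a good reminder to review LLM-generated code rather than trusting that it satisfies every constraint in the prompt.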

And just like that, with only 8 lines of code, we've set up our own version of ChatGPT! 🙌🏻

Complete Code

```
# Install necessary libraries
!pip install langchain huggingface_hub langchain_community

import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"

from langchain import HuggingFaceHub

# Initialize the model
llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature": 0.6})

# Use the model to generate a response
prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)
```

Red Ochsenbein (he/him)

"Use a LLM..." would be more appropriate. You're not building a model

Durvesh Danve

Yes, you're right. Thanks for the suggestion!

Alana E

Building?

You're using an API??

Misleading or just a mistake of wording???

Durvesh Danve

Thanks for sharing your thoughts! The intention behind the title was to highlight how easy it can be to use an LLM with minimal code.

ANNA LAPUSHNER

I love that you are coding! I love that you are publishing your process! Thank you for inviting me into the world of this powerful sequence, the awesome Fibonacci...

Durvesh Danve

Hehe Thanks!

ANNA LAPUSHNER

Take care my friend!

Durvesh Danve

Thanks!