Ever wondered how easy it could be to harness the power of cutting-edge AI models in your projects?
With just 8 lines of Python code, you can start using a powerful Large Language Model (LLM) without diving into the complexities of training one from scratch.
Let's see how!
Tools we'll be using:
1. Hugging Face pretrained model (in this case, Falcon)
2. Python
3. Langchain
4. Google Colab
First, open Google Colab and create a new notebook.
Let's start coding:
Step 1:
Install the necessary libraries:
!pip install langchain huggingface_hub langchain_community
Step 2:
Set up your Hugging Face API token as an environment variable:
import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"
To get your token:
- Visit Hugging Face and sign in or create an account.
- Navigate to the settings page and select the Access Token tab.
- Create a token and replace "YOUR_TOKEN" with your actual token.
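Hardcoding the token works for a quick demo, but it's easy to paste the wrong string. As a small sanity check (my own addition, not part of the original steps): current Hugging Face access tokens typically start with hf_, so you can catch paste errors early:

```python
import os

os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"  # paste your real hf_... token here

# Hugging Face access tokens typically start with "hf_";
# warn early if the value looks wrong
token = os.environ["HUGGINGFACEHUB_API_TOKEN"]
if not token.startswith("hf_"):
    print("Warning: this does not look like a Hugging Face access token")
```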
Step 3:
Import HuggingFaceHub from langchain:
from langchain import HuggingFaceHub
Initialize your Large Language Model (LLM):
llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature":0.6})
I'm using the tiiuae/falcon-7b-instruct model here, but there are plenty of other models you can browse on the Hugging Face Hub.
Let's test the model:
prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)
Running this produces:
def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)
And just like that, with only 8 lines of code, we've set up our own ChatGPT-style assistant, powered by a hosted LLM!
Complete Code
# Install necessary libraries
!pip install langchain huggingface_hub langchain_community
import os
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_TOKEN"
from langchain import HuggingFaceHub
# Initialize the model
llm = HuggingFaceHub(repo_id="tiiuae/falcon-7b-instruct", model_kwargs={"temperature":0.6})
# Use the model to generate a response
prompt = 'Generate a Python function to print the Fibonacci series. Ensure the code is optimized for efficiency and has minimal time complexity'
response = llm(prompt)
print(response)
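As a possible next step, you can make the prompt reusable instead of hardcoding one string. LangChain ships a PromptTemplate class for exactly this; its core behavior is plain string substitution, sketched below with only the standard library (build_prompt is a hypothetical helper for illustration, not part of the code above):

```python
# A reusable prompt template, sketched with plain str.format
# (LangChain's PromptTemplate offers the same idea plus input validation)
TEMPLATE = "Generate a Python function to {task}. Ensure the code is optimized for efficiency."

def build_prompt(task):
    # Fill the {task} placeholder with the concrete request
    return TEMPLATE.format(task=task)

print(build_prompt("print the Fibonacci series"))
print(build_prompt("reverse a string"))
```

You could then pass any of these prompts to llm(...) exactly as in the complete code above.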