Onur Atakan ULUSOY
Neuralink for your AI Agents



Tiger is a community project that builds a reusable, integrated tool ecosystem for the LLM agent revolution. Tiger uses Upsonic to store tools in isolation and to generate their documentation automatically. You can create your own Tiger for your agents, or use the community-maintained public Tiger 🐅.

Tiger is inspired by Neuralink: it provides an AI-computer interface whose threads connect to the LLM interface, giving an AI the opportunity to use a computer by thinking.

With Tiger, your LLM agents can write and run code, use search engines, view your calendar, control your mouse and keyboard, play audio through your headphones, and act on anything else your agent thinks of. Tiger transforms these thoughts into real actions. The project's philosophy is to use AI knowledge to generate actions, supported by standard infrastructure. We aim to build:

  • A utility point for agent tools in any framework that has a function-calling mechanism
  • A great community that supports great tools across different technologies and sources
  • A free, open, MIT-licensed tool library for the AI agent ecosystem


The Tiger project has a general-purpose public library that includes the tools in the tools library. To use it, you can rely on the standard connection in the upsonic Python library. After installing the upsonic library, create a Tiger object and integrate it into your agents.

  • Tiger requires Python 3.8 or higher
pip3 install upsonic
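As a quick sanity check after installation, you can verify that your interpreter meets the version requirement before wiring Tiger into an agent (a minimal sketch using only the standard library):

```python
import sys

# Tiger requires Python 3.8+; fail fast with a clear message otherwise
if sys.version_info < (3, 8):
    raise RuntimeError(f"Python 3.8+ required, found {sys.version.split()[0]}")

print("Python version OK:", sys.version_info[:2] >= (3, 8))
```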

Current Tools

We are working on Upsonic, and the tools inside its tools folder are pushed to the public Tiger with each release. We aim to create tools that need no API key and that behave like normal human actions, such as searching on Google with the mouse, keyboard, and browser.

  • Interpreter

    • python
    • check_package
    • execute
    • install_package
  • Search

    • google
    • read_website

If you want to add functions to the public Tiger, see the Adding Tools section.
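To illustrate what a tool like `interpreter.python` does conceptually (run a snippet of Python the agent wrote and hand the output back), here is a hypothetical local stand-in; `run_python` is not part of the Tiger API, just a sketch of the idea:

```python
import io
import contextlib

def run_python(code: str) -> str:
    """Execute a Python snippet and return whatever it printed.

    A toy stand-in for what a tool like interpreter.python provides:
    the agent sends code as text and gets the textual result back.
    """
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})  # run in an empty namespace
    return buf.getvalue()

# The agent "thinks" in code; the tool turns that thought into a result
print(run_python("print(15231 * 64231)"))  # → 978302361
```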

Public Dashboard

For the public Tiger, you can view the functions, their documentation, and their readmes in the public dashboard. You can use it as a documentation reference as well.


  • username: tiger
  • password: tiger

LangChain Integration

Tiger can share its tools with LangChain agents, so your agents can use Tiger functions. In this example we ask a multiplication question; the agent uses Tiger's interpreter.python module, writes a Python snippet, and Tiger returns the result behind the scenes. With this, the agent can perform mathematical operations in just two lines of code.

# Getting the tiger tools for interpreter.python
from upsonic import Tiger
tools = Tiger().langchain()

# Generating Agent and executor with tiger tool set
from langchain_openai import ChatOpenAI
from langchain import hub
from langchain.agents import AgentExecutor, create_openai_functions_agent

llm = ChatOpenAI(model="gpt-4", api_key="OPENAI_API_KEY")
prompt = hub.pull("hwchase17/openai-functions-agent")
agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Asking for 15231 * 64231
agent_executor.invoke({"input": "What is the result of 15231 * 64231"})


AutoGen Integration

Tiger also has an integration with AutoGen agents: you can attach a Tiger to your AutoGen agents. In this example we use the interpreter.python module, so your AutoGen agent can run Python code and view its result. With this, your agent can sleep for 2 seconds as we request.

# Generating Agents with tiger tool set
import autogen

config_list = [
    {
        "model": "gpt-4",
        "api_key": "OPENAI_API_KEY",
    }
]

llm_config = {
    "config_list": config_list,
    "timeout": 120,
}

chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For coding tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
)

# Getting the tiger tools for interpreter.python
from upsonic import Tiger
Tiger().autogen(chatbot, user_proxy)

# Asking the agent to sleep for 2 seconds
user_proxy.initiate_chat(
    chatbot,
    message="Sleep for 2 seconds",
)
