Running LLMs on Android: A Step-by-Step Guide

Davide Fornelli

In this blog post, we'll explore how to install and run the Ollama language model on an Android device using Termux, a powerful terminal emulator. This tutorial is designed for users who wish to leverage the capabilities of large language models directly on their mobile devices without the need for a desktop environment.

Step 1: Install F-Droid

F-Droid is an installable catalogue of FOSS (Free and Open Source Software) applications for the Android platform. Download and install the F-Droid app from https://f-droid.org/.

Step 2: Install Termux

Once F-Droid is installed, open it and search for Termux. Install Termux, which will serve as your Linux environment on Android. Installing from F-Droid is recommended because the Google Play build of Termux is no longer updated.

Step 3: Update and Upgrade Packages

Open Termux and update its package repository to ensure you have access to the latest software versions:

pkg update
pkg upgrade
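
If you prefer to do this in one non-interactive step, the commands can be chained; the -y flag assumes your Termux build of pkg forwards it to apt (recent builds do), so fall back to the interactive commands above if it is rejected:

pkg update && pkg upgrade -y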

Step 4: Install Proot-Distro and Debian

Proot-Distro allows you to run different Linux distributions within Termux. Install it using:

pkg install proot-distro

Then install Debian:

pd install debian

For more details, visit the Proot-Distro GitHub page.
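
Note that pd is simply a shorter alias for the proot-distro command, so the two forms are interchangeable; the list subcommand shows which distributions are available to install.

proot-distro list            # show available distributions
proot-distro install debian  # same as: pd install debian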

Step 5: Login to a Linux Distribution

For this tutorial, we will use Debian. Log in to your Debian environment with:

pd login debian
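
Everything from here until Step 12 happens inside this Debian shell. The full-name form of the command is shown below; when you eventually want to leave Debian, exit drops you back to the plain Termux prompt.

proot-distro login debian   # equivalent full-name form
exit                        # run later, when you want to return to Termux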

Step 6: Install Tmux

Tmux is a terminal multiplexer that allows you to run multiple terminal sessions simultaneously. Inside the Debian environment, update the package lists and install it:

apt update
apt upgrade
apt install tmux
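
For reference, these are the default tmux key bindings used later in this guide; every shortcut starts with the Ctrl+b prefix.

tmux -V          # confirm tmux installed correctly (prints the version)
# Ctrl+b "   split the current window into two panes (one above the other)
# Ctrl+b o   move focus to the next pane
# Ctrl+b d   detach from the session, leaving it running in the background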

Step 7: Install Ollama

Within the Debian environment, download and install Ollama:

curl -fsSL https://ollama.com/install.sh | sh

This script installs Ollama and its dependencies.
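
Inside proot there is no systemd, so the installer may warn that it cannot register Ollama as a service; that is expected, and it is why the server is started by hand in Step 9. To confirm the binary itself installed correctly:

ollama --version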

Step 8: Create a New Tmux Session

Start a new tmux session dedicated to running Ollama:

tmux new -s llm
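
The -s flag just names the session. If you detach with Ctrl+b d or switch away from Termux, the session keeps running as long as the Debian shell stays alive, and you can get back to it:

tmux ls              # list running sessions
tmux attach -t llm   # reattach to the session named "llm"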

Step 9: Run Ollama Server

In the tmux session, start the Ollama server:

ollama serve
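
The server keeps running in this pane and, by default, listens on 127.0.0.1:11434. Once the next pane is open, you can check that it is up; the root endpoint replies with a short status message ("Ollama is running"):

curl http://127.0.0.1:11434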

Step 10: Create a New Pane in Tmux

Split the current window into two panes by pressing Ctrl+b " (press Ctrl+b, release, then press "). Focus moves to the new pane; you can switch between panes at any time with Ctrl+b o.

Step 11: Run an Ollama Model

In the new pane, run a specific Ollama model. For example, to run the Gemma 2B model:

ollama run gemma:2b

Depending on your device's memory, you can also run phi3:

ollama run phi3
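
The first run of each model downloads its weights, so expect a wait on the initial launch. Smaller models such as gemma:2b need the least RAM, while larger ones like phi3 want more headroom, so start small if your device struggles. A few standard Ollama commands that help manage models:

ollama pull gemma:2b   # download a model without starting a chat
ollama list            # show which models are already downloaded
# inside an interactive chat, type /bye to exit back to the shell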

Step 12: Test the Model

Once the model is running, you can interact with it directly from the terminal. Enter prompts and observe the model's responses, effectively using your Android device as a powerful AI tool.
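
The running server also exposes Ollama's HTTP API on localhost, so other apps or scripts on the device can use the model too. A minimal sketch with curl, assuming the gemma:2b model pulled earlier:

curl http://127.0.0.1:11434/api/generate -d '{
  "model": "gemma:2b",
  "prompt": "Explain what a terminal multiplexer is in one sentence.",
  "stream": false
}'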

Conclusion

This tutorial provides a foundation for exploring further applications of LLMs on mobile devices, opening up possibilities for mobile-based AI development and experimentation.

Top comments (2)

Alin Andea

Before running "pd login debian", the user should install it using "pd install debian".

Davide Fornelli

Hi Alin, thank you for the heads up!