Amal Shaji

Posted on • Originally published at amalshaji.wtf

Building the backend - Part II (Live tweet sentiment analysis)

In the previous article, I explained how to build a sentiment classifier. Read it here. In this one, we'll build the backend that serves predictions via an API.

Building the API

Make sure you're in the backend folder.

API

# server.py

import uvicorn
import utils  # project helper that loads the saved classifier
from fastapi import FastAPI
from pydantic import BaseModel
from classify import remove_noise  # noise-removal function written in Part I
from nltk.tokenize import word_tokenize

app = FastAPI()
classifier = utils.load_model()  # load the trained classifier once at startup

class Tweet(BaseModel):
    tweet: str

@app.get("/")
def read_root():
    return {"message": "Welcome to sentiment classifier API"}

@app.post("/api")
def analyse_tweet(tweet: Tweet):
    # tokenize the tweet and strip the noise, as in Part I
    custom_tokens = remove_noise(word_tokenize(tweet.tweet))
    # NLTK classifiers expect a feature dict: each token mapped to True
    result = classifier.classify({token: True for token in custom_tokens})
    return {"sentiment": result}

if __name__ == "__main__":
    # run the server programmatically; uvicorn defaults to port 8000
    uvicorn.run("server:app", host="0.0.0.0", log_level="info")

This is similar to the other articles where I showed how to build APIs with FastAPI. The only change is that in the previous ones we launched the server with uvicorn main:app, whereas here we do it programmatically, so the server can be started with python server.py.
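If you want different settings while developing locally, uvicorn.run accepts the same options as the CLI as keyword arguments. Here's a minimal sketch; the file name dev.py and the localhost-only binding are my own choices for local use, not part of the project:

# dev.py - hypothetical local-development entry point (not part of the repo)
import uvicorn

if __name__ == "__main__":
    uvicorn.run(
        "server:app",      # same import string as `uvicorn server:app` on the CLI
        host="127.0.0.1",  # bind to localhost only while developing
        port=8000,         # uvicorn's default port, written out for clarity
        reload=True,       # restart on code changes; skip this inside the container
        log_level="info",
    )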

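Before dockerizing, you can sanity-check both endpoints without starting a server by using FastAPI's TestClient. This is only a sketch: it assumes the trained model and the NLTK data from Part I are available locally, and depending on your FastAPI version the TestClient needs either requests or httpx installed.

# test_server.py - hypothetical quick check, run with pytest
from fastapi.testclient import TestClient
from server import app

client = TestClient(app)

def test_root():
    response = client.get("/")
    assert response.json() == {"message": "Welcome to sentiment classifier API"}

def test_api():
    # assumes the classifier's labels are "Positive" and "Negative", as in Part I
    response = client.post("/api", json={"tweet": "I hate you"})
    assert response.json()["sentiment"] in ("Positive", "Negative")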
Dockerize the API

FROM python:3.8-slim

WORKDIR /app

COPY requirements.txt .

RUN pip install -r requirements.txt

COPY . .

RUN python3 -m nltk.downloader punkt
RUN python3 -m nltk.downloader wordnet
RUN python3 -m nltk.downloader stopwords
RUN python3 -m nltk.downloader averaged_perceptron_tagger

EXPOSE 8000

CMD ["python3", "server.py"]

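The four downloader lines work fine, but if you'd rather keep the list of NLTK packages in one place, the same data could be fetched by a small Python script invoked from a single RUN line. The script below (download_nltk_data.py) is a hypothetical helper, not part of the original repo:

# download_nltk_data.py - hypothetical helper that fetches the NLTK data the classifier needs
import nltk

PACKAGES = ["punkt", "wordnet", "stopwords", "averaged_perceptron_tagger"]

for package in PACKAGES:
    nltk.download(package)

In the Dockerfile, the four downloader lines would then collapse into a single RUN python3 download_nltk_data.py.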
Build and run the image

docker build -t sentwitter-backend .
docker run -d -p 8000:8000 sentwitter-backend

Test the API

❯ docker run -d -p 8000:8000 sentwitter-backend
ccd9fa0740cfd0f497308ef570e16c950233c29378029aac61c1479cc94163bc

❯ curl http://localhost:8000
{"message":"Welcome to sentiment classifier API"}

❯ curl -X POST "http://localhost:8000/api" -H  "accept: application/json" -H  "Content-Type: application/json" -d "{\"tweet\":\"I hate you\"}"
{"sentiment": "Negative"}
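The same checks can be done from Python instead of curl. A minimal sketch using requests (not listed in this project's requirements.txt, so install it separately if you want to run this):

# check_api.py - hypothetical client-side check against the running container
import requests

BASE_URL = "http://localhost:8000"

# root endpoint: should return the welcome message
print(requests.get(BASE_URL).json())

# /api endpoint: POST a tweet and read back the predicted sentiment
response = requests.post(f"{BASE_URL}/api", json={"tweet": "I hate you"})
print(response.json())  # e.g. {"sentiment": "Negative"}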

We have successfully built the backend. In the next article, we'll build a frontend that retrieves tweets and sends them to the backend for prediction.

