DEV Community


How to setup a cloud deployable machine learning model using Flask 🚀

I am a software developer mostly working with .net technologies and modern javascript frameworks. Currently learning python for data science.

A well-trained machine learning model is not useful until we are able to use it to predict on future data. Predictive models like this are used to solve many kinds of problems. So, we are going to learn how to use a trained machine learning model to predict on future data, which will be fed into the model through an API.

In order to use a machine learning model through a callable API, we need to have a machine learning model in the first place. In this post we are not going to create a model from scratch (I will write another post on how to create a basic artificial neural network model using TensorFlow); instead, we are going to use a model trained with the TensorFlow and Keras libraries on the 'Iris' (flower) dataset. It is a very famous dataset, and we know that its target variable has 3 classes (setosa, versicolor, virginica).

Another thing we need is the Flask library to create the API. So, let's install Flask using pip.
pip install flask

After installing Flask, let's open Sublime Text 3 (or any text editor) and create a Python file. Then I am going to test if the environment is properly set up by creating a basic Flask application with only one '/' route.

from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return '<h1>FLASK APP IS RUNNING!</h1>'

if __name__ == '__main__':
    app.run()

Now, let's open the command prompt in the working directory where the file is located and run the program using the python command. We should see output in the command prompt like below.


If we open the address shown in the command prompt in the browser, we should see that the app is running. If you see something like below, then we are set to proceed.


Now, we are going to follow the steps below to create an API using Flask and call it through a Python script.

  • First, we will load the model using the tensorflow.keras load_model() method.
  • Then, we will load the fitted scaler object (I saved it as a pickle file) using the joblib load() method to transform the new incoming data.
  • Create a function that takes a model object, a scaler object and the new data, and returns a prediction.
  • Create an API endpoint using Flask.
  • Test the API using Postman.
  • Call the API using a Python script and get the prediction.

Load the model

We are going to import load_model from tensorflow.keras.models, as the model was built using TensorFlow.

from tensorflow.keras.models import load_model

Then we will use the code below to load the model into a variable called flower_model. Here, I saved my model as my_model.h5 in the current working directory.

flower_model = load_model("my_model.h5")

Load the scaler object

I saved my fitted MinMaxScaler object as a pickle file named my_scaler.pkl in the current working directory. So, I am going to use joblib to load this scaler object into a variable called flower_scaler. Before that, we need to import the joblib module.

import joblib
flower_scaler = joblib.load("my_scaler.pkl")
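For context, a MinMaxScaler simply rescales each feature into the [0, 1] range using the minimum and maximum values it learned during fitting. Here is a minimal pure-Python sketch of what the transform step does (the feature ranges below are made-up example values, not the real Iris statistics):

```python
# Sketch of what a fitted min-max scaler does at transform time.
# data_min/data_max are learned during fit(); these are example values only.
data_min = [4.0, 2.0, 1.0, 0.1]
data_max = [8.0, 4.5, 7.0, 2.5]

def min_max_transform(row):
    # Scale each feature to (x - min) / (max - min)
    return [(x - lo) / (hi - lo) for x, lo, hi in zip(row, data_min, data_max)]

print(min_max_transform([5.1, 3.5, 1.4, 0.2]))
```

This is exactly why we must use the scaler that was fitted on the training data: the model expects new inputs scaled with the same minimums and maximums.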

Creating a function that returns the prediction

We are going to create a function that takes a model, a scaler and the new data. The data is going to be in JSON format, so we are going to read each field and save it into a variable. This way we can create a list with the same shape as the training features. We have 4 features in this case: sepal_length, sepal_width, petal_length and petal_width. Then we will transform the data using the scaler object and use the model to predict the class. The function code is given below.

def return_prediction(model, scaler, data):
    s_len = data["sepal_length"]    
    s_wid = data["sepal_width"]    
    p_len = data["petal_length"]    
    p_wid = data["petal_width"]

    classes = np.array(['setosa', 'versicolor', 'virginica'])

    flower = [[s_len, s_wid, p_len, p_wid]]

    flower = scaler.transform(flower)

    # predict_classes() was removed in newer TensorFlow versions;
    # taking the argmax of predict() output gives the same class index
    class_ind = np.argmax(model.predict(flower), axis=-1)[0]

    return classes[class_ind]

We need to import numpy as np at the top in order to convert the list of labels (['setosa', 'versicolor', 'virginica']) into a NumPy array.
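To check the data flow without a trained model, we can exercise the same function with stand-in objects. The stub classes below are purely illustrative; they mimic the scaler and model interfaces used above:

```python
import numpy as np

def return_prediction(model, scaler, data):
    s_len = data["sepal_length"]
    s_wid = data["sepal_width"]
    p_len = data["petal_length"]
    p_wid = data["petal_width"]
    classes = np.array(['setosa', 'versicolor', 'virginica'])
    flower = [[s_len, s_wid, p_len, p_wid]]
    flower = scaler.transform(flower)
    # argmax over the (stubbed) probability vector picks the class index
    class_ind = np.argmax(model.predict(flower), axis=-1)[0]
    return classes[class_ind]

class StubScaler:
    def transform(self, rows):
        return rows  # identity transform, for illustration only

class StubModel:
    def predict(self, rows):
        # always "predicts" the first class with full confidence
        return np.array([[1.0, 0.0, 0.0]])

sample = {"sepal_length": 5.1, "sepal_width": 3.5,
          "petal_length": 1.4, "petal_width": 0.2}
print(return_prediction(StubModel(), StubScaler(), sample))
```

With the real flower_model and flower_scaler in place of the stubs, the function behaves the same way, just with genuine scaling and predictions.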

Creating API endpoint using Flask

I am going to create a route, /api/flower, which responds to the HTTP POST method. I am going to use Flask's request object to extract the JSON data posted by a client. Then I am going to use the return_prediction() function to predict on this new data and return a JSON response using jsonify.

@app.route("/api/flower", methods=['POST'])
def flower_prediction():
    content = request.json
    result = return_prediction(flower_model, flower_scaler, content)
    return jsonify(result)

We need to import request and jsonify from the flask library.

from flask import request, jsonify
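The endpoint above trusts the client to send all four fields. A small validation helper (hypothetical, not part of the original post) could reject malformed requests before they reach the model:

```python
REQUIRED_FIELDS = ("sepal_length", "sepal_width", "petal_length", "petal_width")

def validate_payload(data):
    """Return a list of problems with the incoming JSON; empty means valid."""
    errors = []
    if not isinstance(data, dict):
        return ["payload must be a JSON object"]
    for field in REQUIRED_FIELDS:
        if field not in data:
            errors.append(f"missing field: {field}")
        elif not isinstance(data[field], (int, float)):
            errors.append(f"field must be numeric: {field}")
    return errors

print(validate_payload({"sepal_length": 5.1}))
```

In the route, you could check this list first and return jsonify({"errors": errors}) with a 400 status code when it is non-empty.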

Testing the API using Postman

Now, I am going to run the program again using the python command in the command prompt from the working directory. This will launch the server, and we are ready to test the API.

Now, let's open Postman and issue a POST request to the /api/flower endpoint. We will create a body for the request in JSON format and send the request.
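The request body might look like this (the measurements below are sample Iris-style values, not the exact ones from the original screenshot):

```json
{
    "sepal_length": 5.1,
    "sepal_width": 3.5,
    "petal_length": 1.4,
    "petal_width": 0.2
}
```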



We can see the status code is 200 OK, and the API has returned its prediction, 'setosa', based on the input data.

Calling the API using Python script

Next, we are going to write a Python script to call the API. We need to import the requests library in order to do that.

import requests

Then we will create a dictionary object to pass to the API.

flower_example = {"sepal_length": 5.1,
                  "sepal_width": 3.5,
                  "petal_length": 1.4,
                  "petal_width": 0.2}

Finally, we will call the API using post method and print the result.

result = requests.post("http://127.0.0.1:5000/api/flower", json=flower_example)
print(result.json())


That's all for today. We will talk about how to deploy this model to the cloud in my next post.
Reach out to me on Twitter with any suggestions.
Have a great one!
