Alvaro David

PoseNet - TensorFlow.js on GCP

TensorFlow.js is a JavaScript Library for training and deploying machine learning models in the browser and in Node.js.

Fortunately, TensorFlow.js offers ready-to-use models that anyone can use, especially developers with no ML background.

One of them is PoseNet, a standalone model for running real-time pose estimation in the browser using TensorFlow.js.

In the browser PoseNet works perfectly; the challenge comes when we want to use it on the backend. I made a simple API around the model, and I want to share a couple of issues I had to resolve in order to deploy that API on GCP. (Complete source at the end of the post.)

TensorFlow vs Docker

Docker is a powerful tool. However, using TensorFlow.js inside a Docker container is a little harder: messages like Your CPU supports instructions that this TensorFlow binary was not compiled to use... appear very frequently.

Production considerations

From the documentation: The Node.js bindings provide a backend for TensorFlow.js that implements operations synchronously. That means when you call an operation, e.g. tf.matMul(a, b), it will block the main thread until the operation has completed.

For this reason, the bindings currently are well suited for scripts and offline tasks. If you want to use the Node.js bindings in a production application, like a webserver, you should set up a job queue or set up worker threads so your TensorFlow.js code will not block the main thread.
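
As a rough idea of what that recommendation looks like, here is a minimal worker_threads sketch. It is not the solution used in this post, and runPoseNet() is a hypothetical helper that loads PoseNet and runs inference; also note that worker_threads is stable from Node 12 onward and needs a flag on Node 10.

// worker-sketch.js
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker so the blocking TensorFlow.js call
  // never runs on the web server's event loop.
  function estimateInWorker(imagePath) {
    return new Promise((resolve, reject) => {
      const worker = new Worker(__filename, { workerData: { imagePath } });
      worker.on('message', resolve);
      worker.on('error', reject);
    });
  }
  module.exports = { estimateInWorker };
} else {
  // Worker thread: the heavy, synchronous TensorFlow.js work happens here.
  const runPoseNet = require('./run-posenet'); // hypothetical helper
  runPoseNet(workerData.imagePath).then((pose) => parentPort.postMessage(pose));
}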

So... How to resolve this?

Both issues have their own solutions (building a detailed Dockerfile and using worker threads, for example), but there is one resource that resolves both in one shot:

Cloud Functions

  1. Cloud Functions instances already have TensorFlow compiled. (One example that I found was How to serve deep learning models using TensorFlow 2.0 with Cloud Functions)

  2. Each instance of a Cloud Function handles only one concurrent request at a time. This means that while your code is processing one request, there is no possibility of a second request being routed to the same instance. Thus the original request can use the full amount of resources (CPU and memory) that you requested. (Source: Auto-scaling and Concurrency)

Some considerations to keep in mind:

  • Set the maximum amount of memory available to the function (see the deploy command at the end of this post).
  • Starting a new Cloud Functions instance involves loading the runtime and your code. Requests that include Cloud Functions instance startup (cold starts) can be slower than requests hitting existing instances.

And... yes, the request is going to be slower than running on Kubernetes, for example. But this solution is a quick win for developers: we are using a trained model in a serverless environment, and developers with no ML knowledge can take advantage of this technology in an easy way.
In future posts I will cover the other solutions: a detailed Dockerfile and worker threads. Be patient :).

That's it: a starting point for developers entering the ML world.

Code

This API receives a "gs://" path from Google Cloud Storage.
I made it that way because Cloud Functions can also be invoked indirectly, in response to an event from Cloud Storage or Pub/Sub, as the sketch below shows.
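
For reference, a Storage-triggered variant could look roughly like this. It is not part of this post's code: estimatePoseFromGcs() is a hypothetical helper that downloads the object and calls the same estimatePose() shown in index.js.

// Background function fired when an object is finalized in a bucket.
exports.onImageUploaded = async (file, context) => {
  const gcsPath = `gs://${file.bucket}/${file.name}`;
  const pose = await estimatePoseFromGcs(gcsPath); // hypothetical helper
  console.log(JSON.stringify(pose));
};

// Deployed with a Storage trigger, for example:
// gcloud functions deploy onImageUploaded --runtime nodejs10 \
//     --trigger-resource your-bucket --trigger-event google.storage.object.finalize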

Source: https://github.com/AlvarDev/posenet-nodejs-gcp

index.js

// Express
const bodyParser = require("body-parser");
const express = require('express');
const app = express();

// Utils
const path = require("path");
const os = require('os');
const fs = require('fs');

// GCS 
const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

// Tensorflow
const tf = require('@tensorflow/tfjs-node');
const posenet = require('@tensorflow-models/posenet');
const { createCanvas, Image } = require('canvas');

/**
 * Validates that the request body has the "image" attr.
 *
 * @param {Object} req Cloud Function request context.
 *                     More info: https://expressjs.com/en/api.html#req
 * @param {Object} res Cloud Function response context.
 *                     More info: https://expressjs.com/en/api.html#res
 * @param {Function} next Function to be executed after validateReq
 */
function validateReq(req, res, next) {
  if (!("image" in req.body)) {
    res.status(400).send({ message: "image GCS path not found" });
    return;
  }

  next();
}

/**
 * Estimates the pose from an image.
 * 
 * @param {String} imagePath Image local path
 * 
 * @returns {Object} the pose estimation
 */
async function estimatePose(imagePath) {

  const net = await posenet.load({
    architecture: 'MobileNetV1',
    outputStride: 16,
    inputResolution: { width: 640, height: 480 },
    multiplier: 0.75
  });

  const img = new Image();
  img.src = imagePath;

  const canvas = createCanvas(img.width, img.height);
  const ctx = canvas.getContext('2d');
  ctx.drawImage(img, 0, 0);
  const input = tf.browser.fromPixels(canvas);
  const pose = await net.estimateSinglePose(input, { flipHorizontal: false });
  input.dispose(); // free the input tensor so warm instances don't leak memory
  return pose;
} 

/**
 * Request Handler.
 *
 * @param {Object} req Cloud Function request context.
 *                     More info: https://expressjs.com/en/api.html#req
 * @param {Object} res Cloud Function response context.
 *                     More info: https://expressjs.com/en/api.html#res
 */
async function getPose(req, res){

  // Getting the image path (a regex could be used here instead)
  const attrs = req.body.image.split('/');
  const bucketName = attrs[2];
  const filename = attrs[attrs.length - 1];
  const imageGCS = req.body.image.replace(`gs://${bucketName}/`, "");
  const imagePath = path.join(os.tmpdir(), filename);

  // Download from GCS
  try {
    await storage
      .bucket(bucketName)
      .file(imageGCS)
      .download({ destination: imagePath });

  } catch (err) {
    console.log(err.message);
    res.status(404).send({ message: "File not found" });
    return;
  }

  const pose = await estimatePose(imagePath);

  // Remove image
  fs.unlinkSync(imagePath);
  res.status(200).send(pose);
}

app.use(bodyParser.urlencoded({extended: true}));
app.use(bodyParser.json());
app.get("/", (_, res) => { res.send("Hello World!"); });
app.post("/get-poses", validateReq, (req, res) => {
  getPose(req, res);
});
app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).send({ message: "Something went wrong" });
});

exports.app = app;

// For localhost test
// app.listen(8080, () => {
//   console.log(`App listening on port 8080`);
//   console.log('Press Ctrl+C to quit.');
// });

package.json (dependencies)

"engines": {
    "node": ">=10.0.0"
  },
"dependencies": {
    "@google-cloud/storage": "4.7.0",
    "@tensorflow/tfjs-node": "2.0.1",
    "@tensorflow-models/posenet": "2.2.1",
    "body-parser": "1.19.0",
    "canvas": "2.6.1",
    "express": "4.17.1"
  }

To test locally, you can uncomment the app.listen() call:

// For localhost test
app.listen(8080, () => {
  console.log(`App listening on port 8080`);
  console.log('Press Ctrl+C to quit.');
});

execute

node index.js

test locally

curl -X POST \
    http://localhost:8080/get-poses \
    -H 'Content-Type: application/json' \
    -d '{"image": "gs://your-bucket/karate.jpg"}'
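
The response is PoseNet's single-pose estimate: an overall score plus 17 keypoints, each with a part name, a confidence score and x/y coordinates. The values below are only illustrative, and the keypoint list is truncated:

{
  "score": 0.92,
  "keypoints": [
    { "score": 0.99, "part": "nose", "position": { "x": 320.5, "y": 120.8 } },
    { "score": 0.97, "part": "leftEye", "position": { "x": 331.2, "y": 110.4 } },
    { "score": 0.96, "part": "rightEye", "position": { "x": 310.9, "y": 111.0 } }
  ]
}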

deploy to Cloud Functions

gcloud functions deploy app --runtime nodejs10 --trigger-http --allow-unauthenticated
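
About the memory consideration mentioned earlier: you can raise the instance memory (and the timeout, if needed) at deploy time. The flags are standard gcloud options; the values below are just examples.

gcloud functions deploy app --runtime nodejs10 --trigger-http --allow-unauthenticated \
    --memory 2048MB --timeout 120s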
