DEV Community

Fady GA ๐Ÿ˜Ž

FruitifyMe! A deep learning fruitifier that uses Github Actions, AWS, Tensorflow and React.js

What I built

I've built an app that tells you your "fruity" resemblance (what type of fruit(s) your photo resembles).

Category Submission:

Wacky Wildcards

App Link

http://fruitifyme.com.s3-website-us-east-1.amazonaws.com

Screenshots

App usage

Model Training:

Promoting Models:

Model Testing and Deployment with Github Actions:

Site Deployment with Github Actions:

Description

You choose a photo and the app shows you your fruit resemblance. Behind the scenes, the app uses a deep learning classifier model trained with Tensorflow on 27 fruit types from the Fruits Images Dataset. This isn't the best use of an ML model, and it isn't the best classifier either, but it fits this use case perfectly. I've focused more on the CI/CD aspect of the project, using Github Actions, than on the ML process.

Link to Source Code

https://github.com/FadyGrAb/fruit-origins

Permissive License

Apache License 2.0

Background (What made you decide to build this particular app? What inspired you?)

I've used Tensorflow Python API before but never the Tensorflow.js API. I thought it would be a great opportunity to do basic MLOps with Github Actions and use the Tensorflow.js API in a React app. And I thought that it's a fun app to share with friends ๐Ÿ˜ƒ.

How I built it (How did you utilize GitHub Actions or GitHub Codespaces? Did you learn something new along the way? Pick up a new skill?)

App Architecture:

This app has 3 main components:

  1. The frontend: A single page app built with React.js and hosted as a static website on Amazon S3.
  2. The backend model: An AWS Lambda container function that serves a Tensorflow deep learning model via Amazon API Gateway.
  3. Github Actions: The CI/CD part of the project that has 2 workflows:
    • Deploy model: Tests the model and, if it passes, deploys the container Lambda.
    • Deploy Site: Builds the site and uploads it to S3.
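
The backend component (the container Lambda behind API Gateway) can be sketched as a minimal handler. Everything here is illustrative, not the repo's actual code: the label list is a placeholder for the 27 fruit classes, and the `predict` function is injected so the sketch stays testable without Tensorflow; in the real container it would run the loaded model on the decoded image bytes.

```python
import base64
import json

# Hypothetical label list; the real model classifies 27 fruit types.
LABELS = ["apple", "banana", "cherry"]


def top_matches(probs, k=3):
    """Pair each label with its probability and return the k best matches."""
    ranked = sorted(zip(LABELS, probs), key=lambda pair: pair[1], reverse=True)
    return [{"fruit": name, "score": round(score, 4)} for name, score in ranked[:k]]


def handler(event, context, predict=None):
    """Lambda entry point behind API Gateway.

    `predict` is injected for testability; the real container function
    would call the loaded Tensorflow model here instead.
    """
    image_bytes = base64.b64decode(event["body"])
    probs = predict(image_bytes)
    return {
        "statusCode": 200,
        # The S3-hosted site calls this API cross-origin, so CORS is needed.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps(top_matches(probs)),
    }
```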

Model Life Cycle Management:

The app uses a very basic CNN (Convolutional Neural Network) built with Tensorflow. The initial training takes place in a Jupyter training notebook; this part is usually done by a data scientist. After a satisfactory model is obtained, the notebook is converted to a Python module containing the same code. This conversion is currently manual but can be automated.
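
As a rough sketch of what such a basic CNN looks like in Tensorflow's Keras API (layer sizes and the input image dimensions are assumptions, not the repo's actual model; the post's model was based on the Tensorflow CNN tutorial):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 27  # fruit types in the dataset


def build_model(img_size=128):
    """A minimal CNN classifier in the spirit of the Tensorflow CNN tutorial."""
    model = models.Sequential([
        layers.Input(shape=(img_size, img_size, 3)),
        layers.Rescaling(1.0 / 255),          # normalize pixel values to [0, 1]
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES),            # logits; softmax applied in the loss
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
    return model
```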

The ModelUtils CLI tool, written in Python, handles the model lifecycle. It performs the following:

  • Training: Using the previously mentioned Python module to train a model.
  • Promoting: Manually select a trained model for deployment.
  • Testing: Test the promoted model.
  • Clearing: Clear the selected (promoted) model.
  • Deploying: Deploy the model (converts it to Tensorflow.js graph model).
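
The five lifecycle commands above map naturally onto CLI subcommands. A skeleton of a ModelUtils-style interface might look like this; the subcommand names come from the list above, but the flags and argument names are illustrative:

```python
import argparse


def build_parser():
    """Skeleton of a ModelUtils-style CLI; flags and names are illustrative."""
    parser = argparse.ArgumentParser(prog="modelutils")
    sub = parser.add_subparsers(dest="command", required=True)
    sub.add_parser("train", help="train a model with the notebook-derived module")
    promote = sub.add_parser("promote", help="select a trained model for deployment")
    promote.add_argument("model_id", help="identifier of the trained model to promote")
    sub.add_parser("test", help="run tests against the promoted model")
    sub.add_parser("clear", help="clear the currently promoted model")
    sub.add_parser("deploy", help="convert the promoted model to a Tensorflow.js graph model")
    return parser
```

Keeping the same entry point for humans and for CI means the Github Actions workflows can run exactly what a developer would run locally.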

Model deployment to AWS container Lambda:

A push to the "deploy_model" branch triggers the Test model and deploy Github Actions workflow, which uses the ModelUtils CLI tool for testing and deployment. If the model test passes, the workflow builds the Docker image for the container Lambda, pushes it to the ECR repo, and updates the Lambda code to use the latest image in the repo.
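
A workflow of that shape might look like the following sketch. The branch name comes from the post; the action versions, step commands, registry variable, and function name are all assumptions, not the repo's actual workflow file:

```yaml
# Illustrative sketch only; not the repo's actual workflow file.
name: Test model and deploy
on:
  push:
    branches: [deploy_model]
jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
      - name: Test the promoted model
        run: python -m modelutils test   # hypothetical invocation of the CLI tool
      - name: Build and push the container image
        # Assumes prior ECR login and an ECR_REGISTRY variable.
        run: |
          docker build -t "$ECR_REGISTRY/fruitify:latest" .
          docker push "$ECR_REGISTRY/fruitify:latest"
      - name: Point the Lambda at the new image
        run: >
          aws lambda update-function-code
          --function-name fruitify
          --image-uri "$ECR_REGISTRY/fruitify:latest"
```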

Site deployment to Amazon S3:

A push to the "deploy_site" branch triggers the Deploy static website Github Actions workflow, which builds the static assets and uploads them to the S3 static website hosting bucket.
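
A sketch of such a workflow is below; the branch name and bucket name (taken from the app URL) come from the post, while the build commands and action versions are assumptions:

```yaml
# Illustrative sketch only; not the repo's actual workflow file.
name: Deploy static website
on:
  push:
    branches: [deploy_site]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
      - name: Build the static assets
        run: |
          npm ci
          npm run build
      - name: Sync the build output to the website bucket
        run: aws s3 sync build/ s3://fruitifyme.com --delete
```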

AWS services architecture:

I didn't use a "production-ready" AWS architecture, as I was focusing on the use of Github Actions. Instead, I adopted a relatively simple architecture built on 4 main AWS services:

  • AWS Lambda: To serve the model. I've used a container Lambda to overcome the 50MB zipped upload and 250MB unzipped deployment size limits, as the model packages are bigger than those limits.
  • S3: To host the static website.
  • API Gateway: To call the Lambda using an API.
  • Elastic Container Registry: To store the model image for Lambda.

The S3 website URL is HTTP, not HTTPS; this could be fixed by serving the website through CloudFront.

New things that I've learnt:

  • First time building a React.js app!
  • How to build and run an AWS container Lambda.
  • How to build Github Actions workflows; I hadn't used them before.

Additional Resources/Info

I've used the main Tensorflow tutorial on CNN image classification as a base for the model used in the app.

Top comments (6)

Andrew Brown ๐Ÿ‡จ๐Ÿ‡ฆ

Inference via a Github Actions? Very cool

Fady GA ๐Ÿ˜Ž

Learned from the best ๐Ÿ˜Ž

Andrew Brown ๐Ÿ‡จ๐Ÿ‡ฆ

In Week-X I attempted and gave up on Github Actions lol.
Who's learning from who? lol

Fady GA ๐Ÿ˜Ž

๐Ÿ˜‚๐Ÿ˜‚๐Ÿ˜‚

Nรกndor Holozsnyรกk

This is one of the best submission I saw :D I just can not stop laughing :D Good job!

Fady GA ๐Ÿ˜Ž

Thanks โ˜บ๏ธ. It was really hard to convince those bananas to come on board ๐Ÿ˜‚