
Wendy Wong for AWS Heroes


Image Classification for Natural Disaster Damage with Amazon SageMaker Jumpstart

Recovery in the aftermath of a natural disaster

Northern New South Wales experienced multiple floods in 2022. The town of Lismore, in a low-lying, flood-prone region, did not have measures in place to predict or prevent natural disasters.

After the 2022 floods, many communities were displaced and homes were declared uninhabitable.

There are a few questions to consider in the aftermath:

  • How do governments help citizens find temporary accommodation? How do they assess photos of flood damage?

  • How can insurance companies assess photos of flood damage to homes and businesses for residential and commercial claims?

Machine learning can be used to explore patterns instead of relying on human intuition to classify images.

Learning Objectives

In this lesson you will learn how to:

  • Onboard to an Amazon SageMaker Domain
  • Set up Amazon SageMaker Studio
  • Import a Jupyter Notebook
  • Import your own dataset or use an existing dataset
  • Train and deploy an image classification model using Amazon SageMaker Jumpstart
  • Clean up resources

What is Amazon SageMaker Jumpstart?

Amazon SageMaker Jumpstart lets you train and deploy a machine learning model to solve your business problems, using pre-trained and open-source models, with a few clicks. You may also use the SageMaker Python SDK for programmatic access.
SageMaker Jumpstart Industry notebooks can only be run on Amazon SageMaker Studio.
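
For example, deploying a Jumpstart model programmatically takes only a few lines with the SageMaker Python SDK. This is a minimal sketch; the model ID below is illustrative, so browse Jumpstart for the exact ID of the model you want:

from sagemaker.jumpstart.model import JumpStartModel

# Illustrative model ID; look up the exact ID in SageMaker Jumpstart
model = JumpStartModel(model_id="pytorch-ic-resnet50")

# Deploy the pre-trained model as a real-time endpoint
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)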

What are the benefits?

You may build, experiment with, and deploy machine learning models using:

Foundation models

You may access pre-trained foundation models for your business use case. This allows you to browse and choose from a large library of model providers, and to build, experiment with, customize and deploy your generative AI models.

Built-in algorithms and pre-trained models

You may access built-in algorithms and pre-trained models for specific tasks across vision, text, audio and documents. The models can be fine-tuned and deployed quickly, which is especially useful if you need to test an idea or productionise your machine learning models.

Solution Templates

You may access pre-built ML solutions: browse the solution templates, select one that fits your use case, customize it with your own data, and deploy it with a click of a button.

Share machine learning artifacts

You may also share models and notebooks within your organization so that others can discover and reuse them.

What are common use cases for Amazon SageMaker Jumpstart?

Amazon SageMaker Jumpstart suggests the following use cases:

  • Demand forecasting

  • Credit rating prediction

  • Fraud detection

  • Computer vision

  • Document data extraction and analysis

  • Predictive maintenance

  • Churn prediction

  • Personalized recommendations

  • Reinforcement learning

  • Healthcare and life sciences

  • Financial pricing

  • Causal inference

Solution Architecture

This is my proposed architecture diagram for using Amazon SageMaker Jumpstart with pre-trained ML solution templates, which are launched with AWS CloudFormation.


Dataset

The Hurricane Harvey dataset is provided in the AWS Financial Services Lab for damage classification.

Prerequisites

  • You will need an existing AWS account, or you may create one here.
  • You will need to log in as a user with administrative permissions.

Ensure that the IAM permissions for your administrative user include the following policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "sagemaker:*"
            ],
            "Resource": [
                "arn:aws:sagemaker:*:*:domain/*",
                "arn:aws:sagemaker:*:*:user-profile/*",
                "arn:aws:sagemaker:*:*:app/*",
                "arn:aws:sagemaker:*:*:flow-definition/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "iam:GetRole",
                "servicecatalog:*"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}

Tutorial 1: Onboard to Amazon SageMaker Domain

Before you start using Amazon SageMaker, you will first have to onboard to a SageMaker Domain.

Step 1: Navigate to Amazon SageMaker and select the AWS region that you prefer to use. I will select the AWS region Asia Pacific (Sydney) ap-southeast-2.


Step 2: Choose Domains on the left-hand side menu.


Step 3: Select Create domain.


Step 4: Select Quick Setup on the left-hand side menu.


Step 5: Create a unique domain name.


Step 6: You may use the default name under User profile or you may create a unique name.


From the drop-down menu, you may select the execution role SagemakerFullAccessrole and click Submit.

If you do not have this execution role, you may create one here.

Also, ensure that the 'Enable SageMaker Canvas permissions' box is checked, and finally click Submit.

Step 7: Under VPC, you may choose an existing VPC or the default VPC. For the default VPC, select two subnets, then click Save and continue.


In a few minutes, the domain will be onboarded, showing the status 'Ready' next to the user profile name.

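If you prefer to script the onboarding instead of clicking through the console, the steps above correspond to the CreateDomain API. Here is a minimal sketch with boto3, where the domain name, role ARN, VPC ID and subnet IDs are all illustrative placeholders:

import boto3

sm = boto3.client("sagemaker")

# Illustrative names and IDs; substitute your own
response = sm.create_domain(
    DomainName="my-sagemaker-domain",
    AuthMode="IAM",
    DefaultUserSettings={
        "ExecutionRole": "arn:aws:iam::123456789012:role/SagemakerFullAccessrole",
    },
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
)
print(response["DomainArn"])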

Tutorial 2: Launch Amazon SageMaker Studio Using the AWS Management Console

Step 1: In the search bar, type 'SageMaker' to open the Amazon SageMaker console. On the left-hand side menu, navigate to Studio, where you will see the new domain name created for the user profile, and click Open Studio.


Amazon SageMaker Studio will take a few moments to launch.


You will be directed to the Amazon SageMaker Studio homepage, confirming that you have successfully launched a SageMaker Studio domain.


Tutorial 3: How to train and deploy an image classification model using Amazon SageMaker Jumpstart

In this tutorial, Amazon SageMaker Jumpstart will be used to train an image classification model on the Hurricane Harvey 2017 dataset.

Step 1: The dataset and Jupyter notebooks will be imported into Amazon SageMaker Studio by:

  • Selecting File -> New -> Terminal


The terminal environment will launch.


Step 2: Download the Hurricane Harvey 2017 dataset to Amazon SageMaker Studio by copying this code into the terminal:

# Create a working directory and move into it
mkdir lcnc
cd lcnc
# Install unzip, then download and extract the dataset
sudo yum install -y unzip
curl 'https://static.us-east-1.prod.workshops.aws/public/40de25f9-f9de-4fba-8871-0bf4761d175e/static/resources/finserv/vision.zip' --output vision.zip
unzip vision.zip


The image files will take a few seconds to be imported.


Step 3: Open the file browser pane on the left-hand side menu and click to view the 'lcnc' folder.


Step 4: Navigate to the lcnc/vision/ folder to explore the data, and copy the data from the local SageMaker folder to S3.


Step 5: Double-click on the Jupyter notebook 'explore-data.ipynb' and click Select.


This will initiate the kernel.


Step 6: Run all the cells in the Jupyter notebook by pressing Ctrl + Enter on your keyboard.


The files in the local folder have been successfully uploaded into your Amazon S3 bucket.


You may also check that the image files have been uploaded from the local folder into the 'damage-clf' folder in your Amazon S3 bucket.


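The 'explore-data.ipynb' notebook handles the upload for you, but the core step looks roughly like this sketch using the SageMaker Python SDK. The 'damage-clf' prefix matches the S3 folder above; the exact code in the notebook may differ:

import sagemaker

session = sagemaker.Session()
bucket = session.default_bucket()

# Upload the local image folders to s3://<bucket>/damage-clf/
s3_uri = session.upload_data(path="lcnc/vision", bucket=bucket, key_prefix="damage-clf")
print(s3_uri)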

Step 7: Click on the 'home' icon to navigate to the homepage, scroll down to SageMaker Jumpstart, and click 'Models, notebooks, solutions' to view the 'Model zoo', which includes pre-trained models, example notebooks and pre-built solutions.


Step 8: Scroll down until you reach the section 'Explore all image classification models (162)' and click this hyperlink.


Step 9: Select the model ResNet 50 and click View model.


Step 10: From this pre-trained 'ResNet 50' model, you may click Deploy to deploy the pre-trained (ImageNet) model as a SageMaker real-time endpoint for inference.


The deployment process will take a few seconds to complete.


The model endpoint is in service and ready to serve inference requests.

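With the endpoint in service, you can send it an image and read back the predictions. Here is a minimal sketch using boto3; the endpoint name and image file are illustrative, and the exact response format depends on the model:

import json

import boto3

runtime = boto3.client("sagemaker-runtime")

# Illustrative endpoint name; copy the real one from the SageMaker console
with open("test-image.jpg", "rb") as f:
    response = runtime.invoke_endpoint(
        EndpointName="jumpstart-resnet50-endpoint",
        ContentType="application/x-image",
        Body=f.read(),
    )

predictions = json.loads(response["Body"].read())
print(predictions)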

Tutorial 4: Fine-tune the model and bring in custom data

Step 1: To fine-tune the model on custom hurricane damage images, choose the S3 location of the images we just uploaded using the 'explore-data.ipynb' notebook, and specify the instance type to use for training.


Step 2: Select the instance type 'ml.c5.2xlarge'.


Step 3: Provide a model name, e.g. hurricane-damage-abc.

Step 4: For the output S3 bucket, select Default output S3 bucket.


Copy the S3 URI of the training dataset.


Step 5: Update the hyperparameters as required and select Train.

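The same fine-tuning run can also be expressed in code with the SageMaker Python SDK's JumpStartEstimator. This is a minimal sketch; the model ID, hyperparameter values and S3 URI are illustrative and should match what you configured in the console:

from sagemaker.jumpstart.estimator import JumpStartEstimator

# Illustrative model ID; look up the exact ResNet 50 ID in Jumpstart
estimator = JumpStartEstimator(
    model_id="pytorch-ic-resnet50",
    instance_type="ml.c5.2xlarge",
    instance_count=1,
)

# Illustrative hyperparameters and training data location
estimator.set_hyperparameters(epochs="5", learning_rate="0.001")
estimator.fit({"training": "s3://your-bucket/damage-clf/"})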

Training the model will take a few moments.


The model was successfully trained using custom data uploaded into Amazon S3.


Note: You may also inspect the trained model artifacts saved in the Amazon S3 bucket.


Step 6: To deploy the fine-tuned model, configure the settings, such as specifying the S3 URI path where the model artifact is saved.



Step 7: Under Security settings, select 'Find VPC', choose three subnets and the security group, and click Deploy.

After a few minutes, the endpoint is ready with the status 'In service'.


Step 8: On the left-hand side pane, double-click on the Jupyter notebook 'make-predictions.ipynb', and execute the cells in the notebook to make predictions.


Important note: In cell 5, be sure to replace the default endpoint name with the endpoint name of your deployed model from Step 7.


Step 9: Make predictions using the 'damaged' test data.

When making predictions on new data (i.e. photos), the model returns the probability that the property is 'damaged'.


Step 10: Make predictions using the 'not-damaged' test data.

When making predictions on new data (i.e. photos), the model returns the probability that the property is 'not damaged'.

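Under the hood, the endpoint returns class probabilities, and the notebook maps the highest one to a label. A minimal sketch, assuming a two-class probability list and that the label order matches the training data folders:

# Assumed label order; check the class mapping in 'make-predictions.ipynb'
labels = ["damaged", "not-damaged"]

# Example probabilities returned by the endpoint for one image
probabilities = [0.93, 0.07]

prediction = labels[probabilities.index(max(probabilities))]
print(f"Predicted class: {prediction}")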

Clean Up Resources

Once you have finished with the deployed model, you must delete the endpoint to avoid a surprise end-of-month bill.

Step 1: Delete the endpoints by navigating to Deployments -> Endpoints.


Step 2: Click on each hyperlink to delete the model endpoints one by one.



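You may also clean up programmatically with boto3; a minimal sketch, where the resource names are illustrative:

import boto3

sm = boto3.client("sagemaker")

# Illustrative names; use the names of your own deployed resources
sm.delete_endpoint(EndpointName="hurricane-damage-abc-endpoint")
sm.delete_endpoint_config(EndpointConfigName="hurricane-damage-abc-endpoint-config")
sm.delete_model(ModelName="hurricane-damage-abc")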

Conclusion

In this lesson, you have learnt how to set up Amazon SageMaker Studio for your machine learning project and also deploy an image classification model using pre-trained models in Amazon SageMaker Jumpstart. Keep on building and exploring in the 'Model Zoo' of Amazon SageMaker Jumpstart.

Until the next lesson, happy learning! 😀


Next Lesson

The next few lessons will delve into a mix of classic machine learning modelling techniques as well as AI.

Last week: AWS re:Inforce 2023 on 13-14 June

You may watch the keynote from CJ Moses, Chief Information Security Officer (CISO) at AWS, on YouTube. You may also watch the leadership sessions, keynotes and breakout sessions from AWS re:Inforce 2023 at this link.

Coming soon: AWS re:Invent 2023 conference

You may register now for the AWS re:Invent 2023 conference, held November 27 to December 1, 2023 in Las Vegas.

You may watch the AWS re:Invent 2022 keynote from AWS CEO Adam Selipsky on YouTube on demand.
