
Let's detect flowers! (with SageMaker and DeepLens)

As an AWS-machine-learning-oriented parent, you decide to do something fun with your 5-year-old, Emma. You open your list of "wanna try" projects and pick Flower Detection.

Getting the data

You: "Emma, wanna detect flowers with me?"
Emma: "Yes! Detective Emma!"
giggles
Emma: "Let's use AWS!"
You: "Of course we will. First, let's set up an S3 bucket."
Emma: "What's that?"
You: "An S3 bucket. It's a container for objects: the objects are files, and the bucket is like a folder. It's different from a normal bucket because you can store as many things as you want in it. Here, I'm going to put in the flowers."
Emma: "But we don't have flowers!"
You: "Oh yes we do. But first, let's organize the bucket."

after organizing the data and storing it in the S3 bucket...
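Organizing the bucket might look something like this sketch. The bucket name, class folders, and file names are all made up for illustration; a real upload would call boto3's `s3.upload_file(local_path, bucket, key)`, which is left out here so the sketch runs anywhere:

```python
# Sketch of how the flower images could be laid out as S3 object keys,
# grouped by flower class. Bucket name and files are hypothetical; the
# actual upload would use boto3's s3.upload_file(local_path, bucket, key).

def flower_key(label, filename):
    """Build an S3 object key that groups images by flower class."""
    return f"flowers/{label}/{filename}"

bucket = "emma-flower-detection"  # hypothetical bucket name

images = [
    ("hibiscus", "img_001.jpg"),
    ("dandelion", "img_002.jpg"),
    ("rose", "img_003.jpg"),
]

for label, filename in images:
    print(f"s3://{bucket}/{flower_key(label, filename)}")
```

Keeping one "folder" per class makes the labeling and training steps later much easier to wire up.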

Labeling the data

Emma: "What do we do now?"
You: "Let's label the data."
Emma: "Why?"
You: "We want to use a model to detect these images, so it's important that the model learns from the labels we give it. Without good labels, it will struggle."
Emma: "Oh."
You: "Yeah. We'll use Amazon Mechanical Turk to label the images. It's a crowd of people all over the world who get paid to label data. Only, you can't see them."
Emma: "Let's do it!"

6 hours later, when the labels are ready...
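When a SageMaker Ground Truth labeling job (which can send work to Mechanical Turk) finishes, the labels come back as an "augmented manifest": one JSON object per line, pairing each image with its label. Here's a simplified sketch; the field layout follows the Ground Truth convention, but the job name `flower-labels` and the S3 paths are made up, and real output also includes a metadata block with confidence scores:

```python
import json

# Simplified sketch of a Ground Truth-style augmented manifest line.
# "flower-labels" is a hypothetical labeling-job name; real output also
# carries a "flower-labels-metadata" object with confidence scores.

def manifest_line(image_uri, class_id):
    return json.dumps({"source-ref": image_uri, "flower-labels": class_id})

lines = [
    manifest_line("s3://emma-flower-detection/flowers/hibiscus/img_001.jpg", 0),
    manifest_line("s3://emma-flower-detection/flowers/dandelion/img_002.jpg", 1),
]
print("\n".join(lines))
```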

Model Training

Emma: "Our model is so dumb!"
You: "Exactly. That's why we're going to teach it."
Emma: "What if it doesn't learn?"
You: "We keep teaching. Until it learns."
Emma: "Strenuous."
You: "It is. But I can bet that you'll be proud of the result. Detectives don't give up."
Emma: "YES! Never!"
giggling
Emma: "What nutcracker are we using now?"
You: "Amazon SageMaker."
Emma: "Wow."
You: "It's a service for training models, like other tools, but it's super fast, easy to use, and has many 'toys'."
Emma: (eyes light up) "Toys?"
You: "Yeah, you can play with them to build a really nice model. In formal terms, they're called libraries: ready-made pieces you can use right away."
Emma: "That makes it easier."
You: "What do you think we're going to do to keep teaching our model?"
Emma: "Maybe adjust its settings?"
You: "Exactly. We call them hyperparameters. If we're not happy with the results, we can adjust them and then teach the model again. This time, we expect the result to be better than the first."
Emma: "Understandable. It's just like grounding Alex for partying on Friday till 2 a.m. and telling him to come home earlier."
You: "You're smarter than I am. And yes, that is very true."
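The "settings" you and Emma would tweak between runs are passed to SageMaker as a dictionary. Here's a sketch using the parameter names of SageMaker's built-in image-classification algorithm; the values are illustrative guesses, not tuned results, and in a real run this dictionary would be handed to an Estimator before calling `fit()`:

```python
# Hyperparameters for SageMaker's built-in image-classification algorithm.
# Names match that algorithm's documented settings; values are illustrative.

hyperparameters = {
    "num_classes": "5",              # e.g. hibiscus, dandelion, rose, ...
    "num_training_samples": "3000",  # how many labeled images we uploaded
    "epochs": "10",                  # how many passes over the data
    "learning_rate": "0.01",         # how big each learning step is
    "mini_batch_size": "32",         # images processed per step
    "image_shape": "3,224,224",      # channels, height, width
}

# Unhappy with the result? "Ground" the model: halve the learning
# rate and train again.
hyperparameters["learning_rate"] = str(float(hyperparameters["learning_rate"]) / 2)
print(hyperparameters["learning_rate"])  # 0.005
```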

after training the model
Emma: "Voila!"
You: "Now, let's see the results."
Emma: "How?"
You: "Using a set of metrics. They tell you how well your model performs. One very common metric we'll use to evaluate the model is accuracy."

You both check the score and find 96%. While you celebrate the victory, Emma frowns.

You: "You're not happy?"
Emma: "I wanted a 100% score!"
You: (laughs) "I forgot to explain. A good model is never perfect. In fact, a score of 100% is usually a warning sign."
Emma: "Huh?"
You: "If our model scored 100%, it probably just memorized its training images. Then, when it sees something new, like a dog, it might still confidently call it a flower."
Emma: "How's that?"
You: "It means it learnt too much from the training data; we call that overfitting. It memorized the examples instead of learning what makes a flower a flower, so it can't handle images it hasn't seen. Show it a kitten, a phone charger, or even you, and it might still say 'hibiscus' or 'dandelion'. The same rule applies to life: no one is perfect."
Emma: "This is really interesting."
You: "Good models usually score somewhere between 80 and 99%, depending on the task."
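The score you're both celebrating is simple arithmetic: correct predictions divided by total predictions. A tiny sketch with made-up labels:

```python
# Accuracy is just: correct predictions / total predictions.

def accuracy(predicted, actual):
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

actual    = ["rose", "hibiscus", "dandelion", "rose", "hibiscus"]
predicted = ["rose", "hibiscus", "dandelion", "rose", "dandelion"]

print(f"{accuracy(predicted, actual):.0%}")  # 80%
```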

She smiles.

Model Integration

Emma: "How do we link our model to a device?"
You: "We use SDKs: sets of tools for integrating models. Depending on what we're integrating it into, we need to set up communication between the model and the environment it'll sit in. If it sits in an environment where it knows nobody, it won't communicate. Like your first day of kindergarten, when you refused to play with the other kids because you didn't know any Olivia or Chloe."
Emma: (smiles shyly)

Model Deployment

Emma: "Now, I really want to see how much my model has learnt from my instructions. So that I can ground him too if he makes any silly mistakes."
You: (chuckles) "Yes. Let's use DeepLens."
Emma: "Let me guess. It's another service that uses a device's camera to detect images."
You: "Exactly. Except that it is a deep learning video camera."
Emma: "What?"
You: "A deep learning model is more powerful and can handle complex tasks. DeepLens is a video camera that can run deep learning models, like our flower detector, right on the device."
Emma: "That's why it's called DeepLens."
You: (nods) "Yes."
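On the camera, a DeepLens project runs the model on each video frame and gets back one score per flower class; the overlay simply shows the highest-scoring label. The `awscam` module that supplies frames only exists on the device, so here's a stand-in sketch of that last per-frame step, with made-up class names and scores:

```python
# Stand-in for the per-frame step a DeepLens project performs: the model
# returns one score per class, and the overlay shows the best label.
# Class names, scores, and the 0.5 threshold are illustrative; on the
# real camera, frames come from the device-only awscam module.

CLASSES = ["hibiscus", "dandelion", "rose"]

def best_label(scores, threshold=0.5):
    """Pick the highest-scoring class, or report no confident match."""
    idx = max(range(len(scores)), key=lambda i: scores[i])
    if scores[idx] < threshold:
        return "no flower detected"
    return f"{CLASSES[idx]} ({scores[idx]:.0%})"

print(best_label([0.05, 0.85, 0.10]))  # dandelion (85%)
```

The confidence threshold is what keeps the overlay from calling a kitten a dandelion just because "dandelion" happened to score highest.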

After deploying the model, Emma decides to test it. You both go out into the garden.

You: "Here, take the iPad. Let's analyse the flowers here. Start the video."

As she roams around, she sees live predictions on the video screen and says the names of the flowers she sees. You both share a high five.

Who am I?

Hii! I'm Delia Ayoko. I'm an AWS Community Builder in the Machine Learning category and the first Cloud Club Captain in Cameroon, at the University of Bamenda. I am also a computer engineering student, data scientist, content creator, Wattpad author and mentor. I love building models, especially with AWS. If you loved this article, please react to it so that I write another one next week. Thank you!

P.S: If you're looking forward to starting a career in machine learning, I have videos where I explain ML concepts on my Instagram. Feel free to check it out and follow me there for more content in your feed! :)
