Marc Pous

Watch the birds in your backyard with a Raspberry Pi, Edge Impulse and balena

I love birds, but to be honest with you, I don't have a clue about them.

My parents installed an Amazon camera in their backyard and we started to get video streams of birds visiting a bird feeder, which was awesome. However, I couldn't tell my children what type of bird was on the camera. Then at the children's school we installed a bird nest box with sensors, but we couldn't get any images when a bird was detected. And by serendipity, the balena Ambassador Mithun Das was working on a Bird Watcher project.

In this post I would like to share how you can build a Bird Watcher at home. Once you have it, you will be able to watch birds eating at your bird feeder and detect the type of bird with a Machine Learning model built with Edge Impulse. Everything runs on a Raspberry Pi with a camera, using balena.


Some of the goals of building a Bird Watcher are learning about the birds around us and experimenting with different materials to build the bird feeder, but lately also introducing IoT and AI to children. Cameras are not just devices that take pictures anymore.

Requirements

Hardware

  • Raspberry Pi 3/4 or balenaFin
  • SD card in case of the RPi 3/4
  • Power supply and (optionally) ethernet cable
  • Pi Camera or USB Camera

Software

  • A free balenaCloud account (up to 10 devices are free)
  • A free Edge Impulse account
  • balenaEtcher to flash the SD card

Deploy the fleet

One-click deploy using Balena Deploy

Running this project is as simple as deploying it to a balenaCloud fleet. You can do it in just one click using the button below:

Follow the instructions, click Add a Device and flash an SD card with the balenaOS image downloaded from balenaCloud. Enjoy the magic 🌟Over-The-Air🌟!

Join the balenaHub fleet

To keep things even simpler, join the BirdWatcher fleet on balenaHub.


Train your machine learning model to detect birds

The first thing you need is your own bird Machine Learning model made with Edge Impulse.

My recommendation is to start with Mithun’s ML model here and clone it into your free Edge Impulse account.


If you prefer to start from scratch, just create an Edge Impulse project and choose Images. Then select that you would like to Classify multiple objects (object detection). With this you will be able to detect the location of multiple objects in an image, for example many birds eating at the bird feeder together. Keep in mind that object detection is more compute intensive than image classification and is only available for Linux-based devices like the Raspberry Pi or Jetson Nano. Remember, in this project we are going to use the Raspberry Pi running on balena.
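
As a rough illustration of the difference (the bird labels and numbers below are made up, this just shows the general shape of the results the Edge Impulse Linux SDK returns), a classification result carries one set of scores per image, while an object detection result carries a list of labelled boxes:

```python
# Illustrative only: approximate shape of the two result types.

# Image classification: one score per label for the whole picture
classification_result = {
    "classification": {"blue_jay": 0.92, "cardinal": 0.05, "sparrow": 0.03}
}

# Object detection: one labelled bounding box per bird found in the picture
object_detection_result = {
    "bounding_boxes": [
        {"label": "blue_jay", "value": 0.91, "x": 34, "y": 12, "width": 96, "height": 96},
        {"label": "sparrow", "value": 0.78, "x": 160, "y": 40, "width": 88, "height": 92},
    ]
}
```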


Data acquisition

Now it’s time to create the Training Data and the Test Data by acquiring data from the sensors. In our case, the sensor will be a camera. My recommendation is to upload pictures of birds that you know live in your region.


In the future, the best option will be to train your model with pictures taken by the Bird Watcher device, with your own background, and label them on Edge Impulse.

Remember to add the correct labels to all the birds. From my experience, to get a good Machine Learning model from pictures you will need dozens of pictures of each type of bird, so be patient.
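
You can drag and drop pictures into the Studio, but if you already have folders of labelled bird photos, a small script against the Edge Impulse ingestion API saves time. This is a minimal sketch, assuming your project API key lives in an `EI_API_KEY` environment variable and your pictures sit in a local `pictures/blue_jay` folder (both are placeholders for your own setup); double-check the ingestion API docs if the endpoint has changed.

```python
import os
import requests

API_KEY = os.environ["EI_API_KEY"]   # assumption: Edge Impulse project API key stored in an env var
LABEL = "blue_jay"                   # the bird species this batch of pictures shows
FOLDER = f"pictures/{LABEL}"         # hypothetical local folder with .jpg files

for filename in os.listdir(FOLDER):
    if not filename.lower().endswith(".jpg"):
        continue
    with open(os.path.join(FOLDER, filename), "rb") as f:
        # Upload one picture to the training set, labelled with the bird species
        res = requests.post(
            "https://ingestion.edgeimpulse.com/api/training/files",
            headers={"x-api-key": API_KEY, "x-label": LABEL},
            files={"data": (filename, f, "image/jpeg")},
        )
    print(filename, res.status_code)
```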

Impulse Design

The next step is to create the Impulse. An impulse takes raw data (a picture in our case), uses signal processing to extract features, and then uses a learning block to classify new data. Once the impulse is saved, it’s time to check the raw data of the samples and Generate features.


When you click “Generate features” you will see the feature explorer diagram with all the samples represented on X-Y-Z axes. This lets you check whether the samples are properly separated; if they are, the ML model is more likely to detect the birds correctly.


Finally it’s time to click Object detection and train your Neural Network (MobileNetV2). Once you have fine-tuned the training settings (number of training cycles, learning rate and validation set size), click Start Training. As training output you will get the last training performance with a precision score, plus more information about the generated model.

Deploy with balena

Now you are ready to deploy your device with balena. Go to the BirdWatcher GitHub repository and click Deploy with balena.


If you don’t have a balenaCloud account, create a free account (you will be able to connect up to 10 devices for free) and create a fleet. The BirdWatcher project will deploy the latest release of the source code to your fleet.

Click Add a new device, configure balenaOS (add your WiFi credentials) and either download the image or flash the SD card of your Raspberry Pi directly using balenaEtcher.


Insert the flashed SD card into the Raspberry Pi and power it up. You should see your device come online in your fleet and all the BirdWatcher services being deployed.


Open your birdwatcher.local website

Once you see the device online and ready on balenaCloud, check that your laptop or mobile device is connected to the same WiFi as your Bird Watcher and type http://birdwatcher.local into your browser.


On the local UI you will be able to see the stream from the BirdWatcher's camera.
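
Under the hood, the detection service feeds camera frames to the Edge Impulse model. This is a minimal sketch of that loop using the Edge Impulse Linux Python SDK, not the BirdWatcher's exact code; `modelfile.eim` is a placeholder for the .eim model you download from your Edge Impulse project.

```python
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = "modelfile.eim"   # placeholder: the .eim model exported from your Edge Impulse project
CAMERA_ID = 0                  # first camera found on the device

with ImageImpulseRunner(MODEL_PATH) as runner:
    model_info = runner.init()
    print("Loaded model with labels:", model_info["model_parameters"]["labels"])

    # classifier() grabs frames from the camera and yields one result per frame
    for res, frame in runner.classifier(CAMERA_ID):
        for bb in res["result"].get("bounding_boxes", []):
            print(f"Saw a {bb['label']} ({bb['value']:.2f}) "
                  f"at x={bb['x']}, y={bb['y']}, {bb['width']}x{bb['height']}")
```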

Watched birds

Click on Snaps to see all the "watched" birds, plus detected birds that are unknown, since we are also using the Raspberry Pi camera as a motion sensor.


From Snaps you will be able to send pictures to Edge Impulse to retrain the ML model. The picture sent to Edge Impulse will be the raw image, without any detection box drawn on it.

Settings

Go to Settings and start defining your settings, such as your Edge Impulse keys.


Enable the motion flag if you want to watch unknown birds and record their images to feed back into your Edge Impulse ML model for retraining: set the ENABLE_MOTION variable to Y.
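
If you are curious how a plain camera can act as a motion sensor, the general idea is frame differencing: compare consecutive frames and save a snap when enough pixels change. This is only an illustrative sketch with OpenCV, not the BirdWatcher service's actual implementation, and the sensitivity threshold is arbitrary.

```python
import cv2

cap = cv2.VideoCapture(0)   # Pi Camera or USB camera
previous = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Grayscale + blur makes the comparison less sensitive to noise
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    if previous is None:
        previous = gray
        continue
    # Count the pixels that changed significantly between consecutive frames
    delta = cv2.absdiff(previous, gray)
    moving = cv2.countNonZero(cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1])
    if moving > 5000:                      # arbitrary sensitivity threshold
        cv2.imwrite("snap.jpg", frame)     # save a snap of the (possibly unknown) bird
    previous = gray
```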

Finally, if you would like to receive the pictures and the ML model results over Telegram, turn on the Telegram notifications: set ENABLE_TG to Y and add the keys of your Telegram bot.
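
For reference, sending a detection over Telegram only takes one Bot API call. This is a minimal sketch; the TG_BOT_TOKEN and TG_CHAT_ID variable names are assumptions made for this example, so use whatever keys the BirdWatcher settings actually expect.

```python
import os
import requests

# Assumed variable names for this sketch; the BirdWatcher settings may use different ones
BOT_TOKEN = os.environ["TG_BOT_TOKEN"]
CHAT_ID = os.environ["TG_CHAT_ID"]

def send_bird_photo(image_path: str, label: str, confidence: float) -> None:
    """Send one detection (photo + caption) to a Telegram chat via the Bot API."""
    with open(image_path, "rb") as photo:
        requests.post(
            f"https://api.telegram.org/bot{BOT_TOKEN}/sendPhoto",
            data={"chat_id": CHAT_ID, "caption": f"Bird detected: {label} ({confidence:.0%})"},
            files={"photo": photo},
        )

if os.environ.get("ENABLE_TG") == "Y":
    send_bird_photo("snap.jpg", "blue_jay", 0.91)
```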


Print your Bird Feeder


Download the STL files here: bird_watcher_3D_print.zip


Attributions

Thank you to Mithun Das for developing this amazing application and to Edge Impulse for providing support on the balena Linux integration.
