
Simon Waight

Originally published at blog.siliconvalve.com

Real-time air quality monitoring and alerting with Azure and PurpleAir – Part 1

Anyone who was living in Australia during the 2019/2020 summer can’t help but remember the massive bushfires we had, and the impact they had on air quality.

Probably the starkest way to illustrate how bad it was is this post from December 10, 2019. I also added a recent follow-up post to show what it normally looks like here.

I’ve had a weather station for the house for probably 10 years or so, and around the time of these fires I started looking into an air quality sensor as a way to identify when we shouldn’t be spending time outside.

If I’m quite honest, needing to check whether the air outside is safe to breathe feels a little crazy, but I think it’s the wake-up call many of us needed to realise that this issue affects everyone, not just certain heavily industrialised parts of the world.

On top of this, while the first photograph above makes it clear it’s unhealthy to head outside, air quality can reach dangerous levels for some people well before the smoke is that visible.

After a number of months of looking I was unable to find a good quality consumer air sensor in stock. Increased demand from the following year’s North American and European fires, along with pandemic-driven shortages, meant I put this on the back-burner and only periodically checked for stock. One of my colleagues, Dave Glover, built his own hardware solution, but I wasn’t ready to go down that rabbit hole!!

During a recent browsing session I happened across PurpleAir. Their devices are less consumer-orientated than I would have liked, and ideally I wanted something battery-operated, but I liked that they list their device details, including that each unit has two laser-based particle sensors (PMS5003) (coincidentally, the same sensor Dave used in his DIY project).

Having a bit of a dig I could also see that they make sensor data publicly available and that you can access your own device’s data as well. This is a win-win for me: everyone gets the benefit of more readings, and I get something I can build on top of.

Despite the pricing (and the shipping 😪) I decided to take the plunge…

After installation I went to check the PurpleAir map for my device and was pleasantly surprised to find that the New South Wales Government’s Department of Planning, Industry and Environment (DPIE) is using these sensors – this one in Manly.

Manly Sensor Reading

Understanding Air Quality Index (AQI) and measurements

👨‍🔬 Non-scientist alert 👨‍🔬 I’m going to put this disclaimer here. I’m not an air quality expert. Please take what follows with a grain of salt and do your own reading!

The aforementioned DPIE has a really good summary of air quality; if you want to go and have a read, I’d highly recommend it.

The PMS5003 sensors are particle sensors, so for us, the two air quality properties we can measure will be:

| Measure | Period | Units | GOOD | FAIR |
| --- | --- | --- | --- | --- |
| Particulate matter < 10 µm (PM10) | 1 hr | µg/m³ | < 50 | 50–100 |
| Particulate matter < 2.5 µm (PM2.5) | 1 hr | µg/m³ | < 25 | 25–50 |

The PMS sensors send through data for much finer particles than PM2.5, but the two measures above will be key for our final solution. We'll also use DPIE's Air Quality Categories (AQC) as our measure, which means anything other than "GOOD" can impact sensitive people. I haven't listed all the categories here, but we'll alert whenever the category changes state (up or down).
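
To make the categories concrete, here's a rough sketch (my illustration, not code from the final solution) of how a pair of 1-hour readings could be mapped to a category using the thresholds in the table above. Only GOOD and FAIR are shown; the higher DPIE bands would follow the same pattern.

```csharp
using System;

// Illustrative mapping from 1-hour PM averages (µg/m³) to a DPIE-style
// Air Quality Category, using only the GOOD/FAIR thresholds shown above.
public enum AirQualityCategory { Good, Fair, AboveFair }

public static class Aqc
{
    public static AirQualityCategory FromPm25(double pm25) =>
        pm25 < 25 ? AirQualityCategory.Good :
        pm25 <= 50 ? AirQualityCategory.Fair :
        AirQualityCategory.AboveFair;

    public static AirQualityCategory FromPm10(double pm10) =>
        pm10 < 50 ? AirQualityCategory.Good :
        pm10 <= 100 ? AirQualityCategory.Fair :
        AirQualityCategory.AboveFair;

    // The overall category is the worse of the two measures – this is the
    // value we'd compare against the previous state to decide when to alert.
    public static AirQualityCategory Overall(double pm25, double pm10) =>
        (AirQualityCategory)Math.Max((int)FromPm25(pm25), (int)FromPm10(pm10));
}
```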

Oh look, the PA-II supports Azure IoT!

Well, yes, the registration screen includes Azure as a potential Data Processor, except… good luck getting it to work, and it’s currently completely undocumented.

Configuring data processors for PA-II air sensor

Sadly I was unable to get it to work, and with the PurpleAir team currently shifting support platforms I wasn’t able to find out how it’s supposed to be configured.

Which means…

CODE ALL THE THINGS!

The proposed solution

Like any self-respecting nerd I have various bits of tech to hand, with my Synology Network Attached Storage (NAS) device being a really useful Swiss Army knife. The Synology NAS supports Docker containers and has been handy for hosting a bunch of things for me, so I thought it would make sense to re-use it for this solution.

.NET 6 also recently shipped, and I wanted to have a play with the new Minimal API model that removes a lot of boilerplate code from your solutions. What a great scenario to use it for!
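
Purely as an illustration (this isn’t the gateway code, and the route and payload names are made up), here’s roughly what a single POST endpoint looks like under the Minimal API model:

```csharp
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// One POST endpoint: no Startup class or controller scaffolding required.
// The route and payload shape below are placeholders for this sketch.
app.MapPost("/sensorreading", (SensorReading reading) =>
{
    // ...validate and forward the reading to IoT Hub here...
    return Results.Ok();
});

app.Run();

// Minimal placeholder payload model for the example.
record SensorReading(double Pm2_5, double Pm10);
```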

The diagram below shows the high level flow of the overall solution that will allow us to get our sensor data into Azure IoT Hub.

Proposed Solution Architecture

Time to code!

I chose to use Visual Studio Code to build my Gateway solution, which is a standard ASP.NET Web API. The Gateway accepts HTTP POST requests from the PA-II sensor and parses the JSON payload, which is then sent to an Azure IoT Hub using the Azure IoT Device C# SDK.

You can find the final API solution on GitHub. You will see it includes a Dockerfile which is then used by the associated GitHub Action to build the solution and publish the resulting image to Docker Hub.

All the logic to handle the data from the sensor lives in the SensorReadingController.cs file, which contains a single method to handle the HTTP POST request (view it on GitHub). I created a C# class to model the JSON payload from the sensor, which also makes the code a bit easier to read!
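
As a simplified sketch of that shape (not a copy of the repository code, and with placeholder property names), the controller and payload model look something like this, using the DeviceClient from the Azure IoT Device SDK:

```csharp
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Devices.Client;

// Simplified model of the sensor payload – the property names here are
// placeholders; the real class in the repo models the full PA-II JSON.
public class SensorReading
{
    public string? SensorId { get; set; }
    public double Pm2_5 { get; set; }
    public double Pm10 { get; set; }
}

[ApiController]
[Route("[controller]")]
public class SensorReadingController : ControllerBase
{
    private readonly DeviceClient _deviceClient;

    public SensorReadingController(DeviceClient deviceClient)
        => _deviceClient = deviceClient;

    // Single POST handler: take the parsed payload and forward it to IoT Hub.
    [HttpPost]
    public async Task<IActionResult> Post(SensorReading reading)
    {
        var json = JsonSerializer.Serialize(reading);
        using var message = new Message(Encoding.UTF8.GetBytes(json))
        {
            ContentType = "application/json",
            ContentEncoding = "utf-8"
        };

        await _deviceClient.SendEventAsync(message);
        return Ok();
    }
}
```

In a sketch like this the DeviceClient would be registered once at start-up (for example from a device connection string supplied via configuration) so the controller simply receives it through dependency injection.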

Given I have a closed network with a strong security setup I probably could have left this Web API open to any caller… but we’ve all seen horror stories of security breaches happening through insecure endpoints, so I thought it best to at least perform some basic client validation (see the lines that perform the check) before allowing the request to be processed. I know this wouldn’t stop a motivated attacker, but hopefully it’d be enough in most cases!! Yes, I could go a lot further, probably right down into the guts of ASP.NET authentication… but for my use case this will suffice.
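
As an example of the kind of check I mean (not necessarily the exact one in the repository), a small piece of middleware could reject anything that doesn’t come from the sensor’s known address before the controller ever runs. The ALLOWED_SENSOR_IP variable name here is made up for the sketch:

```csharp
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();
var app = builder.Build();

// Very basic caller validation: only accept requests from a single known
// source address. ALLOWED_SENSOR_IP is a made-up name for this sketch.
app.Use(async (context, next) =>
{
    var allowed = Environment.GetEnvironmentVariable("ALLOWED_SENSOR_IP");
    var remote = context.Connection.RemoteIpAddress?.ToString();

    if (string.IsNullOrEmpty(allowed) || remote != allowed)
    {
        context.Response.StatusCode = StatusCodes.Status403Forbidden;
        return; // don't process payloads from unknown callers
    }

    await next();
});

app.MapControllers();
app.Run();
```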

Deploying the gateway

I’m using Docker Hub to host my Container Image as it’s the default container registry that Synology’s Docker setup uses, which makes it easy to pull the resulting image to my NAS (it’s also free, which is handy!).

On my NAS I open up the Docker application and search for my custom gateway Image on Docker Hub. Once found I can select the Image and Download it.

Docker image search on Synology NAS

Once the image is downloaded I need to launch an instance of it, so I switch over to the Image tab, select my downloaded Image and click Launch.

Launch Docker Image on Synology

In order for my API to run I need to configure some environment variables and define the TCP port I want to expose the Web API on. I do this by selecting Advanced Settings on the Launch dialog.

Advanced Settings on Launch dialog

Specify the port mapping…

Container Port mapping

and then set the four required environment variables as detailed in the readme on the GitHub repository.

Set environment variables for the container
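
For reference, the gateway picks its settings up from the environment when the container starts. The variable names below are placeholders (the real four are listed in the readme), but the pattern is simply to read each value, fail fast if one is missing, and use them to wire up things like the DeviceClient:

```csharp
using Microsoft.Azure.Devices.Client;

// Placeholder variable names – the actual four are listed in the repo readme.
string Require(string name) =>
    Environment.GetEnvironmentVariable(name)
    ?? throw new InvalidOperationException($"Missing environment variable: {name}");

var iotHubConnectionString = Require("IOTHUB_DEVICE_CONNECTION_STRING"); // placeholder
var allowedSensorIp = Require("ALLOWED_SENSOR_IP"); // placeholder, feeds the caller check

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

// Register a single DeviceClient for the controller to forward readings with.
builder.Services.AddSingleton(
    DeviceClient.CreateFromConnectionString(iotHubConnectionString, TransportType.Mqtt));

var app = builder.Build();
app.MapControllers();
app.Run();
```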

Once configured you can start the gateway Container, and you should see confirmation in the logs that it has started.

Next we need to update our PA-II device registration so that it calls our newly deployed API Gateway. Let’s go ahead and configure that on the PurpleAir registration site.

PurpleAir device registration with new gateway

Once the registration is saved we should see a call to the Gateway API every two minutes.

Container Logs

As a final check we can switch over to our Azure IoT Hub Overview tab and see that events are arriving as expected.

Azure IoT Hub Overview screen

So, what now?

At this stage we are delivering filtered events from our local sensor to an Azure IoT Hub instance, but this is only part of our overall solution.

The IoT Hub instance will hold our event data for up to 7 days before it expires. We could choose to route the events to an Azure Storage Account or other endpoint, but for the purpose of this blog series I am simply going to leave them sitting in IoT Hub until we are ready to build the next stage of our solution. This way I am avoiding incurring additional costs until I’m ready to develop and deploy the cloud processing and storage part of my solution.

Hopefully this has been an insightful post which shows you how you can quickly take locally generated data and push it into Azure for additional processing.

Until the next post! 😎
