
Timothy Spann.   πŸ‡ΊπŸ‡¦

Originally published at datainmotion.dev

Updating Machine Learning Models At The Edge With Apache NiFi and MiNiFi


Yes, we have bidirectional communication with MiNiFi agents from Apache NiFi via Site-to-Site (S2S) over HTTPS. This means I can push in anything I want to the agent, including commands, files and updates.

I can also transmit data to edge agents via MQTT, REST and Kafka amongst other options.

NiFi Ready To Send and Receive Messages From Other NiFi Nodes, Clusters and MiNiFi Agents

Our NiFi flow is consuming Kafka and MQTT messages, as well as reading updated model files and generating integration-test sensor data.

MiNiFi Agents Have Downloaded the Model and Anything Else We Send to Them

It's easy to configure MQTT message consumption in CEM: we just need the broker (with port) and, optionally, a topic to filter on.
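If you want to push a quick test message through that broker so the agent's MQTT consumer has something to pick up, a minimal publisher sketch in Python (using the paho-mqtt package) might look like the following; the broker host, port, and topic here are placeholders, not the actual values from my CEM flow.

import json
import paho.mqtt.publish as publish

# Hypothetical broker and topic - substitute the ones configured in CEM.
BROKER_HOST = "mqtt.example.com"
BROKER_PORT = 1883
TOPIC = "edge/sensors"

# Publish one JSON test message for the MiNiFi agent's MQTT consumer to pick up.
payload = json.dumps({"sensor_id": "test-1", "temperature": 22.5})
publish.single(TOPIC, payload, hostname=BROKER_HOST, port=BROKER_PORT, qos=1)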

To listen for files/models, you can easily add a REST endpoint to proxy in data of your choice, with or without SSL.

Here's an example curl command to test that REST API:


curl -d '{"key1":"value1", "key2":"value2"}' -H "Content-Type: application/json" -X POST http://ec2-3-85-54-189.compute-1.amazonaws.com:8899/upload
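The same endpoint can be exercised from Python if you prefer; here is a minimal sketch with the requests library that posts a model file as a raw body, assuming the host, port, and /upload path from the curl example above and a hypothetical local model filename.

import requests

# Endpoint taken from the curl example above; the model filename is hypothetical.
URL = "http://ec2-3-85-54-189.compute-1.amazonaws.com:8899/upload"

with open("mobilenet_v1.tflite", "rb") as model_file:
    response = requests.post(
        URL,
        data=model_file.read(),
        headers={"Content-Type": "application/octet-stream"},
    )

print(response.status_code)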

We can easily generate JSON IoT-style data for integration tests using GenerateFlowFile:
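Outside of NiFi, the same style of integration-test record can be produced with a few lines of Python; the field names below are hypothetical stand-ins for whatever your sensors actually report.

import json
import random
import time
import uuid

# One fake IoT reading, similar in spirit to what GenerateFlowFile emits for tests.
record = {
    "uuid": str(uuid.uuid4()),
    "ts": int(time.time() * 1000),
    "temperature": round(random.uniform(20.0, 35.0), 2),
    "humidity": round(random.uniform(30.0, 70.0), 2),
}

print(json.dumps(record))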

Let's grab updated models from my data science server when they change:
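If you wanted to mimic that step outside of NiFi, a small polling sketch like the one below could watch a model file for changes and re-post it to the REST endpoint shown earlier; the file path, poll interval, and URL are all assumptions for illustration.

import os
import time
import requests

MODEL_PATH = "/models/mobilenet_v1.tflite"   # hypothetical model location
UPLOAD_URL = "http://ec2-3-85-54-189.compute-1.amazonaws.com:8899/upload"
POLL_SECONDS = 30

last_mtime = 0.0
while True:
    mtime = os.path.getmtime(MODEL_PATH)
    if mtime > last_mtime:
        # The model changed on the data science server, so push the new version.
        with open(MODEL_PATH, "rb") as f:
            requests.post(UPLOAD_URL, data=f.read(),
                          headers={"Content-Type": "application/octet-stream"})
        last_mtime = mtime
    time.sleep(POLL_SECONDS)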

I can read Kafka messages and send them to MiNiFi agents as well.
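For testing that Kafka path, a tiny producer sketch in Python (kafka-python) can drop a message onto the topic the flow consumes; the bootstrap server and topic name below are placeholders.

import json
from kafka import KafkaProducer

# Hypothetical broker and topic - use the ones your flow's Kafka consumer reads.
producer = KafkaProducer(
    bootstrap_servers="kafka-broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("edge-commands", {"command": "reload-model", "version": 2})
producer.flush()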

So I pushed a TFLite model, but ONNX, PMML, Docker, or Pickle are all options.
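On the device side, loading that TFLite model takes only a few lines; this is a minimal sketch with the tflite_runtime package, and the model path is a hypothetical location where the MiNiFi agent dropped the file.

import numpy as np
from tflite_runtime.interpreter import Interpreter

# Hypothetical path where the agent wrote the downloaded model.
interpreter = Interpreter(model_path="/opt/minifi/models/mobilenet_v1.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run one inference on dummy input just to confirm the new model loads and executes.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]).shape)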
