Time-series and IoT data share most characteristics, with a few differences such as the timestamp attribute: time-series data arrives at predefined intervals, whereas IoT events can arrive in random windows. IoT event analytics can be done either with AWS IoT Analytics or with Kafka, but both have certain data-processing limitations. The solution considered in this article combines Kafka streaming with IoT Analytics. The integration of AWS MSK with Kinesis Data Analytics or AWS Glue will be covered in detail in another article.
The Kafka protocol is built on top of TCP/IP, whereas standard IoT devices support MQTT connections; MQTT was designed for unreliable networks and constrained communication. The AWS IoT Core Device Gateway supports both HTTP and MQTT, provides bi-directional communication with IoT devices, and can filter and route data across enterprise applications. It also makes device registration simple, accelerating IoT application development.
Apache Kafka is a distributed messaging framework that provides fast, scalable, high-throughput ingestion. Kafka can replay IoT events, provides long-term storage, acts as a buffer for high-velocity data, and integrates easily with other enterprise applications. Real-time device monitoring is not possible in IoT Analytics, whereas Kafka streaming combined with an event-processing framework can trigger actions for anomalies in real time. With AWS MSK there is no downtime during cluster upgrades, and clusters can be provisioned in about 15 minutes. Another major advantage is that there is no charge for data-replication traffic across Availability Zones, a key factor compared with expensive, highly available self-hosted Kafka clusters.
AWS IoT Analytics can enrich IoT data, perform ad-hoc analysis, and build dashboards using QuickSight. It is a simple, serverless way to do data preparation, cleansing, and feature engineering, and it can be integrated with notebooks and AWS SageMaker to build machine-learning models. Custom analysis code packaged in a container can also be executed on AWS IoT Analytics for use cases like understanding the performance of devices, predicting device failures, etc.
Timestream can easily store and analyze trillions of events per day, and data retention can be controlled based on analytics needs. It has built-in time-series analytics functions, helping you identify trends and patterns in your data in near real time. AWS claims up to 1,000x faster query performance at as little as 1/10th the cost of relational databases. When the same record is received more than once for a timestamp, Timestream can deduplicate it, which addresses a common problem with streaming events. For very high ingestion volumes, events can be delivered through Kinesis Data Firehose into S3 for long-term storage, while Timestream serves as the query-serving database.
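Timestream's deduplication works through an optional Version attribute on each record: a re-sent record with the same dimensions and timestamp is rejected unless it carries a higher version. A minimal sketch of building such a record follows; the database, table, and dimension names are illustrative, not from the article.

```python
def build_record(vin: str, measure: str, value: float, ts_ms: int) -> dict:
    """Build a Timestream record. The Version field lets a re-sent record
    with the same dimensions and timestamp upsert instead of duplicating."""
    return {
        "Dimensions": [{"Name": "vin", "Value": vin}],
        "MeasureName": measure,
        "MeasureValue": str(value),
        "MeasureValueType": "DOUBLE",
        "Time": str(ts_ms),
        "TimeUnit": "MILLISECONDS",
        "Version": ts_ms,  # higher version wins on conflict
    }

record = build_record("1HGCM82633A004352", "speed", 72.5, 1518682818000)
# Writing it would then be (requires AWS credentials):
# import boto3
# boto3.client("timestream-write").write_records(
#     DatabaseName="iot", TableName="telemetry", Records=[record])
```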
AWS Lambda can process data from an event source like Apache Kafka. Lambda is inexpensive, its memory can be scaled from 128 MB to 10,240 MB, and a processing timeout can be set per function. IoT device payloads are small, so real-time device-control operations based on incoming payload values are easier to implement with AWS Lambda than with serverless services like AWS Glue or AWS Kinesis Data Analytics. Lambda can be triggered by an event source like AWS MSK, or it can be scheduled in AWS.
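When MSK is the event source, Lambda receives records grouped by topic-partition with base64-encoded values. A minimal handler sketch is shown below; the topic name and payload fields are illustrative.

```python
import base64
import json

def handler(event, context=None):
    """Minimal sketch of a Lambda handler for an MSK event source.
    MSK delivers records grouped by topic-partition, base64-encoded."""
    payloads = []
    for records in event.get("records", {}).values():
        for record in records:
            payloads.append(json.loads(base64.b64decode(record["value"])))
    return payloads

# Sample MSK-shaped event with one telemetry record (hypothetical topic name)
sample = {"records": {"automotive-telemetry-0": [
    {"value": base64.b64encode(
        json.dumps({"vin": "TEST123", "speed": 42}).encode()).decode()}
]}}
print(handler(sample))  # [{'vin': 'TEST123', 'speed': 42}]
```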
Device registration and IoT message simulation are critical tasks during development, and data engineering teams often need to simulate IoT events using SaaS providers such as MQTTLab, Cumulocity IoT, etc.
Device registration and IoT event data simulation for this article were done through the AWS IoT Device Simulator. The simulator supports device-type registration, controls the number of devices simulating data, and lets you define the payload structure for each device. Automotive telemetry data was considered and simulated as shown in the steps below.
Add Device Type
Create a simulation stack in an AWS region and add custom devices. The automotive telemetry payload attributes are built in and cannot be changed.
Simulate IoT Data
Automotive telemetry data is quite comprehensive, and the simulator considered here can publish messages on three topics:
Telemetry Topic Payload
"timestamp": "2018-02-15 08:20:18.000000000"
Trip Topic Payload
Diagnostics Topic Payload
After completing the exercises, stop the simulation.
IoT Analytics setup is a simple process: a channel, pipeline, and datastore can be created in a single click as shown below.
The above setup was created to select all telemetry payloads from the subscribed topic.
The pipeline can be used to filter any unwanted attributes from the payload before creating a datastore.
S3 is used as the underlying datastore, and the data format can be Parquet or JSON.
Once the simulation is started, the ingestion process can be monitored in the IoT Analytics window.
The dataset can be created and scheduled to be refreshed from the datastore. The dataset can be refreshed from the last import timestamp using the delta time option.
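The same scheduled, delta-refreshed dataset can also be defined through the IoT Analytics API. A sketch of the `create_dataset` parameters follows; the dataset, datastore, timestamp expression, and schedule values are illustrative assumptions.

```python
# Parameters for iotanalytics.create_dataset; names are illustrative.
dataset_params = {
    "datasetName": "telemetry_dataset",
    "actions": [{
        "actionName": "telemetry_sql",
        "queryAction": {
            "sqlQuery": "SELECT * FROM telemetry_datastore",
            "filters": [{
                # deltaTime selects only rows newer than the last run
                "deltaTime": {
                    "offsetSeconds": -60,  # grace period for late arrivals
                    "timeExpression": "from_unixtime(timestamp)",
                }
            }],
        },
    }],
    "triggers": [{"schedule": {"expression": "rate(15 minutes)"}}],
}
# boto3.client("iotanalytics").create_dataset(**dataset_params)
```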
The IoT Analytics dataset can be used in AWS SageMaker notebooks and as a data source for QuickSight.
Create a notebook and open it using the SageMaker instance.
Import dataset and analyze the data for building machine learning models for predictive maintenance of devices.
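Inside the notebook, the dataset content is fetched as a presigned URI via `get_dataset_content` and then loaded into a dataframe. The sketch below parses the response shape; the dataset name and URI value are illustrative.

```python
def latest_content_uri(response: dict) -> str:
    """Pick the first entry's presigned URI from a get_dataset_content
    response (shape per the boto3 IoTAnalytics API)."""
    return response["entries"][0]["dataURI"]

# Sample response shape (URI value is illustrative)
sample_response = {"entries": [
    {"entryName": "result", "dataURI": "https://example.com/result.csv"}]}
uri = latest_content_uri(sample_response)

# In a SageMaker notebook, assuming boto3 and pandas are available:
# import boto3, pandas as pd
# resp = boto3.client("iotanalytics").get_dataset_content(
#     datasetName="telemetry_dataset", versionId="$LATEST")
# df = pd.read_csv(latest_content_uri(resp))
```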
A basic working version of SageMaker code is kept in Github. The feature engineering, model training and inference pipeline for near real-time data will be covered in a separate article in a detailed manner.
Timestream pricing is $0.50 per 1 million writes of 1 KB. The average telemetry payload is 145 bytes, and the average aggregated trip record is 800 bytes. By merging all the dimensions like speed, fuel_level, etc. at the VIN level, we can reduce the number of writes per batch and the cost. Payload structure and write batching are explained in detail in my previous article. Create a Rule and Action for Timestream if the data is aggregated at the source itself, or process IoT events stored in MSK using Kinesis Data Analytics for Flink or a custom Apache Spark application.
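Since Timestream bills writes in 1 KB increments, a lone 145-byte telemetry record still costs a full write; merging measures at the VIN level packs several into one billed write. A rough arithmetic and batching sketch follows; the dimension values and table names are illustrative.

```python
# Rough cost arithmetic: writes are billed in 1 KB increments at
# $0.50 per million, so up to 7 records of 145 bytes merged at the
# VIN level fit into one billed write.
records_per_write = 1024 // 145  # 7
cost_per_million_merged = 0.50 / records_per_write

# CommonAttributes lifts shared fields (VIN, time) out of each record,
# shrinking the payload sent per batch; names below are illustrative.
common = {"Dimensions": [{"Name": "vin", "Value": "TEST123"}],
          "Time": "1518682818000", "TimeUnit": "MILLISECONDS"}
records = [
    {"MeasureName": "speed", "MeasureValue": "72.5",
     "MeasureValueType": "DOUBLE"},
    {"MeasureName": "fuel_level", "MeasureValue": "0.8",
     "MeasureValueType": "DOUBLE"},
]
# boto3.client("timestream-write").write_records(
#     DatabaseName="iot", TableName="telemetry",
#     CommonAttributes=common, Records=records)
print(records_per_write)  # 7
```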
AWS MSK was recently introduced as one of the Actions for IoT Core, so an IoT Rule Action can act as a producer for Managed Streaming for Apache Kafka. Create a VPC destination and add the cluster details and authentication mechanism of the Kafka cluster that will receive the messages.
Create a lambda function to consume and process the data.
for partition in partitions:
    consumer.seek_to_beginning(partition)  # sample read; use offsets
The diagnostic data is lower in volume but needs immediate action. The Lambda consumes the event, checks for a problem code, and issues a STOP signal to the device by publishing a payload to the MQTT topic meant for the device shadow.
response = IOTCLIENT.publish(topic=iottopic, qos=0, payload=json.dumps(msg))
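The decision step feeding that publish call can be sketched as below; the trouble codes, payload shape, and shadow topic name are illustrative assumptions, not the article's exact schema.

```python
import json
from typing import Optional

def stop_command(diagnostic: dict,
                 problem_codes=("P0217", "P0522")) -> Optional[dict]:
    """Return a STOP payload if the diagnostic trouble code is critical,
    else None. Codes and payload shape are illustrative."""
    if diagnostic.get("dtc") in problem_codes:
        return {"state": {"desired": {"command": "STOP"}}}
    return None

msg = stop_command({"vin": "TEST123", "dtc": "P0217"})
iottopic = "$aws/things/TEST123/shadow/update"  # device-shadow update topic
# IOTCLIENT = boto3.client("iot-data")
# if msg:
#     IOTCLIENT.publish(topic=iottopic, qos=0, payload=json.dumps(msg))
```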
The working version of the Lambda function is kept in GitHub as kafka_consumer.py.
IoT data can be processed and stored by various AWS services depending on the use case. Serverless AWS services aid the rapid development of large-scale, complicated architectures in an easily scalable manner. Managed services like AWS MSK, along with a choice of distributed processing frameworks such as Apache Flink-based Kinesis Data Analytics or Apache Spark-based AWS Glue, together with AWS Lambda, will help in building scalable, highly available, and cost-effective processing for IoT/time-series events.