I have a use case where transaction data needs to be collected in real time. There is an API that returns transaction data for a given transaction id, and there are millions of transaction ids. There is also an API that returns the latest transaction id, and that number increases every second. In addition, I need aggregated information such as daily active users and the number of transactions per day. What would be the best approach to do this?
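For context, this is roughly the polling pattern I have in mind; the endpoint paths, field names, and base URL here are placeholders, not the real API:

```python
import time
import requests

BASE_URL = "https://example.com/api"  # placeholder, not the real API


def get_latest_transaction_id() -> int:
    # Ask the API for the most recent transaction id
    resp = requests.get(f"{BASE_URL}/transactions/latest")
    resp.raise_for_status()
    return resp.json()["id"]


def get_transaction(tx_id: int) -> dict:
    # Fetch the full record for a single transaction id
    resp = requests.get(f"{BASE_URL}/transactions/{tx_id}")
    resp.raise_for_status()
    return resp.json()


def poll_forever(start_id: int, delay_seconds: float = 1.0):
    # Keep catching up to the latest id, then wait and repeat
    next_id = start_id
    while True:
        latest_id = get_latest_transaction_id()
        while next_id <= latest_id:
            yield get_transaction(next_id)  # hand the raw record to the loader
            next_id += 1
        time.sleep(delay_seconds)
```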
I tried running the ETL code in an infinite while loop and stored the raw data in PostgreSQL, but I'm not sure how to get statistics such as daily active users or the number of transactions per day. Should I maintain an aggregated table manually while inserting the raw data? Or should I run a SQL query over the raw table to get the statistics? Note: this is real-time data.
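To make the two options concrete, this is a sketch of what I mean. The table and column names (`transactions`, `daily_stats`, `user_id`, `created_at`, etc.) are just examples, assuming a raw table exists with one row per transaction:

```python
import psycopg2

conn = psycopg2.connect("dbname=analytics")  # placeholder connection string

# Option 1: compute statistics on demand by querying the raw table
DAILY_STATS_SQL = """
    SELECT created_at::date        AS day,
           COUNT(*)                AS transactions,
           COUNT(DISTINCT user_id) AS daily_active_users
    FROM transactions
    GROUP BY created_at::date
    ORDER BY day;
"""

# Option 2: keep a pre-aggregated table up to date on every insert
# (assumes daily_stats has a unique constraint on day)
UPSERT_DAILY_SQL = """
    INSERT INTO daily_stats (day, transactions)
    VALUES (%s::date, 1)
    ON CONFLICT (day)
    DO UPDATE SET transactions = daily_stats.transactions + 1;
"""


def load_transaction(tx: dict):
    with conn.cursor() as cur:
        # Insert the raw record
        cur.execute(
            "INSERT INTO transactions (tx_id, user_id, amount, created_at) "
            "VALUES (%s, %s, %s, %s)",
            (tx["id"], tx["user_id"], tx["amount"], tx["created_at"]),
        )
        # Option 2: bump the daily counter in the same transaction as the raw insert
        cur.execute(UPSERT_DAILY_SQL, (tx["created_at"],))
    conn.commit()
```

One thing I'm unsure about is that a counter like "transactions per day" is easy to maintain incrementally, while "daily active users" needs distinct counting, which seems harder to keep up to date on every insert.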