Sinx for dumb data aggregation

Originally published at artemix.org.

A lot of Linux tools display data to the user.

Task bars, notifications, and dashboards are all commonly used tools.

Something I always found lacking was cron log storage.

Either you had to play with the file system (which isn't something you usually want to do), or build complex scripts to interact with a proper storage solution.

I did build a daemon in Python whose goal was to provide a scheduler that periodically ran tasks configured through a YAML file.

Still, that required a lot of setup: Python itself, a daemon, dependencies, and rewriting every script in Python.

Most of my scripts were written in Bash, so executing them with proper process control was annoying to handle.

I decided to re-code it following the "Unix" way: simple tools, each doing a single job.

Introducing sinx

sinx is a set of three tools:

  • a collector
  • a retriever
  • an HTTP API daemon

Additionally, as I write this article, I have a few status bar / dashboard plugin integrations, and a few new features planned for the future, such as a CGI handler or a Prometheus exporter.

The goal is to simplify both collection and extraction through two paths.

Sinx collects data through its collector binary (which requires shell execution), and you can extract data either through its HTTP daemon or through the retriever CLI tool.

All it needs is a Redis instance to store data in: the data store is based on Redis, and the three tools are built around it.

The collector

The collector is built in a way that makes it easy to integrate in shell scripts or cron jobs.

The base usage is to run sinx-collect <your key> and pipe content to the process' stdin.

When a termination byte is reached, the data is flushed and stored in Redis.
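For instance, a cron job can feed a script's output straight into the collector. The schedule, script path, and key name below are all made up for illustration:

```shell
# Hypothetical crontab entry: every 5 minutes, store a script's output
# under the key api.status (the path and key name are illustrative).
*/5 * * * * /home/user/bin/check-api.sh | sinx-collect api.status
```

Since the collector reads stdin until termination, any existing script works unchanged; only the pipe into sinx-collect is new.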

The CLI retriever

As with the collector, the CLI is made to provide simple access to sinx's stored values.

To get a value from the store, use sinx get <your key>, and you'll receive it on the process' stdout.

What if you try to access something that doesn't exist?

Well, you get a non-zero exit code and a short error message showing that you've tried to get unknown data.

$ sinx get nonexistent
sinx: nonexistent: no value stored for this key
$ echo $?

And if you want to see every defined key, you can use the sinx keys command!

The sinx keys command starts its output with a line of the form total <key count>; from that point on, every line's first value is a key (no other value should be expected on each line).
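Given that format, a script can split the header from the key list with standard tools. The sample output below is fabricated to match the described shape, so the snippet runs without sinx installed:

```shell
#!/bin/sh
# Fabricated `sinx keys` output matching the documented format:
# a `total <key count>` header line, then one key per line.
sample="total 3
api.status
mail.unread
backup.last"

# Header line carries the count; everything after it is a key.
count=$(printf '%s\n' "$sample" | head -n 1 | cut -d' ' -f2)
keys=$(printf '%s\n' "$sample" | tail -n +2)
echo "found $count keys"
```

In a real script, the here-string would be replaced by command substitution on sinx keys itself.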


The HTTP daemon

The sinx-http tool also provides an HTTP daemon made to allow data extraction in JSON format.

Covering it would be out of scope for this showcase article, but you should find everything you need on the daemon's documentation page.

Example use cases

Two common use cases I use sinx for are:

  • API "availability" status check (if the API is up or down) right on my computer
  • "New email" (if I received some emails)

Both are pretty slow and annoying to do directly in a dashboard scripting tool such as conky, so deferring the work to a cron job makes my data life easier.
