
Discussion on: Storing JSON in Postgres using Node.js

Laurin Quast

I had some bad experiences with storing very large JSONB structures and then selecting parts of them in a SQL query (SELECT "field"->'key' FROM "myTable"). I got much better performance by moving the data I actually query out of the JSON blob and into a separate column. I should say that some of those blobs are a few megabytes, since we store data extracted by an ML classifier. Nevertheless, I am curious what data people usually store in JSON columns; for my use cases it is mainly multi-dimensional data that would not make sense to normalize.
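One way to do that separate-column approach is a generated column, though a plain column written by the application works just as well. A minimal sketch, assuming a hypothetical table "my_table" with a large "data" JSONB column and a frequently read 'label' key (the generated column needs Postgres 12+):

```sql
-- Reading "data"->'label' forces Postgres to detoast the whole
-- multi-megabyte blob, so the hot key is copied into its own column.
ALTER TABLE "my_table"
  ADD COLUMN "label" jsonb
  GENERATED ALWAYS AS ("data" -> 'label') STORED;  -- Postgres 12+

-- Queries can now read the small column without touching the big blob:
SELECT "label" FROM "my_table";
```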

Forbes Lindesay

I don't have any experience trying to store ML data in JSONB. Currently I can think of two places where we use JSONB columns at Threads Styling:

  1. We have a "MediaStore" service for storing and processing images and videos. This service is consumed by a range of different services internally, so we allow storing a small blob of metadata along with each image. This makes it really flexible for supporting lots of different scenarios (there is a rough sketch of the shape after this list).

  2. We have a service that synchronises contacts between Google Contacts and an internal database (for our employees). We store the data we currently care about (name, phone numbers etc.) in normalised columns in Postgres, but we also dump the entire JSON document for each contact into a JSONB column. That has proved useful for debugging, and would allow us to extract extra info later by running a DB migration rather than needing to re-sync with Google (see the second sketch below).
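For the first case, a rough sketch of what that shape could look like; the names are hypothetical, not our actual MediaStore schema:

```sql
-- Fixed columns for what every consumer needs, plus a free-form
-- JSONB blob for caller-specific metadata.
CREATE TABLE "media" (
  "id"        BIGSERIAL PRIMARY KEY,
  "url"       TEXT NOT NULL,
  "mime_type" TEXT NOT NULL,
  "metadata"  JSONB NOT NULL DEFAULT '{}'
);

-- One consumer might store a campaign reference, another alt text:
INSERT INTO "media" ("url", "mime_type", "metadata")
VALUES ('https://example.com/a.jpg', 'image/jpeg',
        '{"campaignId": 42, "altText": "Red dress"}');
```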
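And for the second case, the kind of backfill migration I mean; the column and key names are made up for illustration:

```sql
-- We later decide we care about nicknames, so we add a real column
-- and backfill it from the raw Google contact dump instead of re-syncing.
ALTER TABLE "contacts" ADD COLUMN "nickname" TEXT;

UPDATE "contacts"
SET "nickname" = "raw_google_contact" ->> 'nickname'
WHERE "raw_google_contact" ? 'nickname';
```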