Ever needed to serve big JSON files over the network (like 100+ MB files)?
An efficient way to handle this problem is by converting the JSON...
Most HTTP transfers use gzip compression by default. Some will use Brotli, which might be even better for these sorts of files.
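For what it's worth, here is a minimal Node sketch showing which encoding a server or CDN negotiates when the client advertises gzip and Brotli support (browsers send the Accept-Encoding header automatically); the bucket URL is a placeholder:

```js
const https = require('https');

// Hypothetical URL; Node's https module does not auto-decompress,
// so the negotiated Content-Encoding stays visible in the headers.
https.get(
  'https://example-bucket.s3.amazonaws.com/data.json',
  { headers: { 'Accept-Encoding': 'gzip, br' } },
  (res) => {
    console.log('content-encoding:', res.headers['content-encoding']); // e.g. "gzip", "br", or undefined
    res.resume(); // discard the body; only the headers matter here
  }
);
```

Note that plain S3 serves objects exactly as stored, so whether the response comes back compressed depends on how the file was uploaded and what Content-Encoding metadata it carries.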
Agreed, but it depends on the use case. In this case I'm not using any server; I just have to download dataset files from an S3 bucket and then render them on Google Maps or in Mapbox.
I was thinking of making an Express server but later dropped that idea; it's not optimal in the case of datasets.
Edit: changed "CDN" to "S3 bucket".
I'm missing something. A CDN (content distribution network) is a server. Downloading from a CDN over HTTP will compress the file.
To the best of my knowledge, a CDN is not a server. CDN Wiki
I have updated my comment to not confuse the readers.
Interesting! I guess encoding and decoding a large file is quite efficient and takes less time than network transfer? Curious if you have a rough estimate of the time difference between the two approaches.
@tlylt I have updated the article with time logs.
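For anyone who wants to reproduce that kind of measurement, a rough browser-side sketch (inside an async context; the URLs are placeholders and pako is just one possible decompressor) could look like this:

```js
import { ungzip } from 'pako';

// Approach 1: download the raw JSON and parse it.
console.time('raw JSON');
const raw = await (await fetch('https://example.com/big.json')).json();
console.timeEnd('raw JSON');

// Approach 2: download the gzipped file, decompress, then parse.
console.time('gzipped JSON');
const buf = await (await fetch('https://example.com/big.json.gz')).arrayBuffer();
const data = JSON.parse(ungzip(new Uint8Array(buf), { to: 'string' }));
console.timeEnd('gzipped JSON');
```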
Sure, I will be updating the article, probably today.
I'm curious what you do with such a large file on the client. Do you keep it all in memory, or do you put it into, e.g., IndexedDB? In the latter case, it might be more efficient to stream and continuously save the data in NDJSON or a similar format, but again, it depends on the shape of your data and how you want to use it.
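As a rough illustration of that streaming idea (the URL and the saveRecord callback are placeholders; saveRecord could, for example, write each record to IndexedDB), the browser's fetch streams make it fairly straightforward:

```js
// Read an NDJSON file chunk by chunk and handle each record as it
// arrives, instead of holding the whole parsed file in memory.
async function streamNdjson(url, saveRecord) {
  const res = await fetch(url);
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the last, possibly incomplete line
    for (const line of lines) {
      if (line.trim()) saveRecord(JSON.parse(line));
    }
  }
  if (buffer.trim()) saveRecord(JSON.parse(buffer));
}
```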
I have to fetch these files, apply some filters, and then give the data to deck.gl to render on Google Maps or in Mapbox.
I keep this data in the Redux store for now, but I think I should use some kind of DB on the client side.
For now I do not know how to stream data into deck.gl but would love to know about it. If you have any resources, please share.
I'm using React.js with no backend.
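For reference, here is a minimal sketch of the flow described above (fetch, filter, then hand the data to deck.gl); the data shape, styling, and view state are assumptions, not taken from the actual setup:

```jsx
import React from 'react';
import DeckGL from '@deck.gl/react';
import { GeoJsonLayer } from '@deck.gl/layers';

// `features` would be the already-fetched and filtered dataset.
function DatasetMap({ features }) {
  const layer = new GeoJsonLayer({
    id: 'dataset',
    data: { type: 'FeatureCollection', features },
    filled: true,
    getFillColor: [255, 0, 0, 120],
    pickable: true,
  });

  return (
    <DeckGL
      initialViewState={{ longitude: 77.2, latitude: 28.6, zoom: 9 }} // assumed view
      controller={true}
      layers={[layer]}
    />
  );
}

export default DatasetMap;
```

To overlay this on an actual base map, deck.gl also ships integration modules such as @deck.gl/google-maps and @deck.gl/mapbox.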
Thanks Abhay
You may be interested in digging into:
npmjs.com/package/uraniumjs
npmjs.com/package/superjsonatural
So it achieves a better compression ratio than gzip and works faster than CBOR while staying close to JSON, BUT it still supports typed arrays!
Not useful in this case. Although there is also fflate, which is much better than pako. I put the link in the footer; you can check it out.
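For completeness, a minimal sketch of the fflate-based decompression step on the client (the URL is a placeholder, and this assumes the dataset was uploaded as a gzipped JSON file):

```js
import { gunzipSync, strFromU8 } from 'fflate';

// Fetch the gzipped file as raw bytes, decompress, then parse.
async function loadGzippedJson(url) {
  const res = await fetch(url); // e.g. '.../data.json.gz' on the S3 bucket
  const compressed = new Uint8Array(await res.arrayBuffer());
  return JSON.parse(strFromU8(gunzipSync(compressed)));
}
```

fflate also has asynchronous and streaming variants if decompressing 100+ MB on the main thread becomes a concern.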