
Discussion on: 60fps Javascript while you stringify, parse, process, compress and filter 100Mbs of data

jsoneaday

Hi Mike, awesome work! I'm wondering if this can be applied to the backend. What I mean is: if you're able to process arrays of a million records, could this replace the event loop in Node?
In Node the event loop is single-threaded, so a long-running synchronous task can block it. Would your generator technique allow more scaling by letting HTTP calls overlap and doing work during main-thread idle time?
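(To illustrate the kind of blocking being described here, a toy sketch, not from the article; the /slow route and the busy loop are made up for the example:)

```js
// One synchronous handler stalls every other request, because the
// event loop can't process other callbacks while the loop runs.
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/slow') {
    let total = 0;
    // A long synchronous task: nothing else is served until it finishes.
    for (let i = 0; i < 1e9; i++) total += i;
    res.end(`done: ${total}`);
  } else {
    // While /slow is running, even this trivial route is blocked.
    res.end('hello');
  }
}).listen(3000);
```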

Mike Talbot ⭐ • Edited

You know, I think it actually would, yes, with a modified form that only runs for a particular period of time.

At the moment the polyfill does a requestAnimationFrame and then uses setTimeout to measure the amount of time remaining. On Node I guess we'd just say "you are allowed 20ms" and then skip to the next main loop.
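Something along these lines, perhaps (just a rough sketch of the idea, not the actual polyfill; runSliced, budgetMs and sumLargeArray are names made up for the example):

```js
// Run a generator in time slices on Node, handing control back to the
// event loop whenever the budget (e.g. 20 ms) is used up, so pending
// I/O callbacks can run in between slices.
function runSliced(iterator, budgetMs = 20) {
  return new Promise((resolve, reject) => {
    function step() {
      const start = Date.now();
      try {
        let result = iterator.next();
        // Keep advancing the generator until it finishes or the slice budget expires
        while (!result.done && Date.now() - start < budgetMs) {
          result = iterator.next();
        }
        if (result.done) {
          resolve(result.value);
        } else {
          // Budget used up: give the event loop a chance to run other callbacks
          setImmediate(step);
        }
      } catch (err) {
        reject(err);
      }
    }
    setImmediate(step);
  });
}

// Example generator: sums a large array, yielding at each step so it can
// be paused between slices.
function* sumLargeArray(data) {
  let total = 0;
  for (let i = 0; i < data.length; i++) {
    total += data[i];
    yield;
  }
  return total;
}

// runSliced(sumLargeArray(hugeArray)).then(total => console.log(total));
```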

I'll get to doing that.

jsoneaday

Terrific, Mike! Imagine if you could get Node to scale to 500 thousand or more. Currently Node scaling is pretty poor: on TechEmpower, raw Node manages only 176,190 requests per second (128th place).
techempower.com/benchmarks/#sectio...
I'm also a JS dev, so if you need help let me know.

Mike Talbot ⭐

And I'd be delighted to get any help anyone would like to give. I'm sure that there are some other very useful functions that could be added!