
js-coroutines gives your code: data indexing and lookup functionality, in idle time

Mike Talbot ・ 4 min read


Following user feedback, I've added some key lodash-like functions to the out-of-the-box features of js-coroutines.

I've added keyByAsync, groupByAsync, includesAsync and uniqueByAsync. Here are all of the "out of the box" functions now available. They all work asynchronously, spreading the load over multiple frames to ensure that your app stays interactive.

| Function | Use |
| --- | --- |
| appendAsync | Appends one array to another, modifying the destination |
| compressAsync | Compresses a string using lz-string; all of the other lz-string methods are also available |
| concatAsync | Concatenates two arrays, creating a new one |
| decompressAsync | Decompresses a string compressed with lz-string |
| everyAsync | Validates that every member of a collection passes a predicate |
| findAsync | Finds an entry in a collection that passes a predicate function, or returns null |
| findIndexAsync | Finds the first index that passes a predicate |
| forEachAsync | Calls a function for every element in a collection |
| groupByAsync | Creates an index object where each key contains an array of all matching values |
| includesAsync | Returns true if an array includes a value |
| indexOfAsync | Returns the first index of an item in a collection |
| keyByAsync | Creates an index object where each key maps to the last item in the collection that produced that key |
| lastIndexOfAsync | Returns the last index of an item in a collection |
| mapAsync | Runs a mapping function against each element of an array and returns a new array with the results |
| parseAsync | Parses JSON into an object or value |
| reduceAsync | Runs a reduce operation over all elements of a collection and returns the result |
| someAsync | Checks whether some entries in a collection match a predicate |
| stringifyAsync | Converts a JavaScript object/value into JSON |
| uniqueByAsync | Creates an array of unique values, where uniqueness is determined by calling a function on each entry |

You can now execute code like this:

     const response = await fetch("some/url")
     const data = await parseAsync(await response.text())
     const index = await keyByAsync(data, v => v.id)
     const groups = await groupByAsync(data, v => v.category)

Of course, you can also write your own generator functions to split up any kind of processing you might need to do - and all of these functions work with the new pipe() to create functional pipelines that don't hog the main thread.


     const process = pipe(
       decompressAsync,
       parseAsync,
       keyByAsync.with(v => v.id)
     )
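
Conceptually, a pipe like this just threads each step's resolved value into the next step. Here is a minimal, self-contained sketch of the idea (this is not the js-coroutines implementation; `pipeAsync`, `parseStep` and `keyByStep` are hypothetical names standing in for the real steps):

```javascript
// Minimal async pipeline combinator: each step receives the previous
// step's resolved value, so sync and async steps compose the same way.
const pipeAsync = (...fns) => async (input) => {
  let value = input;
  for (const fn of fns) {
    value = await fn(value); // awaits both sync and async steps
  }
  return value;
};

// Hypothetical steps standing in for parseAsync / keyByAsync.with(...)
const parseStep = async (text) => JSON.parse(text);
const keyByStep = (fn) => async (items) =>
  items.reduce((index, item) => ((index[fn(item)] = item), index), {});

const process = pipeAsync(parseStep, keyByStep((v) => v.id));
```

The real library steps additionally yield control back to the browser between chunks of work; the composition mechanics are the same.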

Another new feature is support for "collections": objects with key/value pairs can now be used as well as arrays with all of the key functions where that makes sense (in the table above these are shown as taking 'collection' parameters).
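
To illustrate what "collection-aware" means, here is a self-contained sketch (not the library source; `groupByCollection` is a hypothetical name) of a groupBy that enumerates arrays and plain objects the same way:

```javascript
// A "collection" can be an array or a plain object; either way we
// enumerate its values and group them by the key the callback produces.
function groupByCollection(collection, fn) {
  const values = Array.isArray(collection)
    ? collection
    : Object.values(collection);
  const groups = {};
  for (const value of values) {
    const key = fn(value);
    (groups[key] = groups[key] || []).push(value);
  }
  return groups;
}
```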


Discussion


Great collection! Thanks!

In the meantime, for "keyBy" I played around with this:


const myKeyBy = (arr, key) =>
  reduce(arr, yielding((a, c) => (c?.[key] ? { ...a, [c[key]]: c } : a)), {});

That should certainly work! I'd just say inside this one function I'd avoid immutability - because nothing has access to the mutable object and it will allocate a lot less memory if you modify one thing rather than make a new one on each loop.
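
For comparison, a mutable version of that reducer might look like this (my sketch, in plain JS and without the yielding wrapper for brevity; `myKeyByMutable` is a hypothetical name):

```javascript
// Mutating a single accumulator object instead of spreading into a new
// one on every iteration avoids quadratic allocation on large arrays.
const myKeyByMutable = (arr, key) =>
  arr.reduce((acc, item) => {
    if (item?.[key] !== undefined) acc[item[key]] = item;
    return acc;
  }, {});
```

Since the accumulator never escapes the function before the reduce finishes, nothing can observe the mutation, so this is safe.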

Certainly, and perhaps it's significant on large data.
I put up something like this:

  function* xKeyBy(collection, fn) {
    let result = {}
    yield* forEach(collection, function* (value, key) {
      let newKey = fn(value);
      yield;
      result[newKey] = value;
      yield;
    })
    return result;
  }

  //not blocking the UI
  const dispatchKeyBy = async (dt, key, action) => {
    let kby = await run(function* () {
      return yield* xKeyBy(dt, x => x[key]);
    });
    dispatch({ type: action.type, payload: kby });
  }

and call it from useEffect inside a hook...

For some reason I could not use the keyBy provided "out of the box". I think it's an npm issue on my side.

Odd, on the "out of the box" version all my tests seem to be ok. You probably don't want to yield that much, though. I'd say (if the collection is an array) that you should yield every 32 iterations or so, using

    if ((key & 31) === 0) yield

It's a balance between checking often enough and slowing the whole thing down by checking too often.
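
In context, that batched check looks like this (a self-contained sketch with a hypothetical `keyByBatched` name; the real library's scheduler would pause at each yield when frame time runs out, whereas here a plain loop just drives the generator to completion):

```javascript
// Yield only on every 32nd iteration: frequent enough to stay responsive,
// rare enough that the check doesn't dominate the cost of the loop.
function* keyByBatched(array, fn) {
  const result = {};
  for (let i = 0; i < array.length; i++) {
    if ((i & 31) === 0) yield; // the scheduler would pause here if needed
    result[fn(array[i])] = array[i];
  }
  return result;
}
```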

In more recent versions you can also just call run() on the result of executing the generator; in your example this simplifies the second routine to:

    const dispatchKeyBy = async (dt, key, action) => {
      let kby = await run(xKeyBy(dt, x => x[key]));
      dispatch({ type: action.type, payload: kby });
    }

The npm version should be 2.3.62.

It totally makes sense!
I updated the npm package as I was way behind... odd that I had to uninstall and then install again, otherwise it stayed on version 1.1.36.
I hope that won't break the code now :)
Thanks for everything!

Right, I added a breaking change a while ago, but it only affects a very small use case, so hopefully all is ok. It was when I changed from using an external polyfill for requestIdleCallback to the new internal one.

Updated, and everything seems to work like a charm!
Still, it seems you forgot a "debugger" statement in keyBy in array-utilities.js.
(Should I have written that up on GitHub? If so, sorry!)

Ouch lol. Ooops. Fixing that.

2.3.63 - removed the debugger

Is there any performance benefit to using parseAsync(response.text()) over the usual response.json()?

This is really situational, because JSON.parse is really fast. With small workloads you will measure it in tens of microseconds. But it does block the event loop.

Agreed, parsing is a lot faster than stringifying, but it can definitely take more than 16.7ms with significant payloads. You'd be trading raw performance for smoothness, which is sometimes worth it with large payloads. I'd say it should only be used if a glitch is actually present.

If it were an option, I'd also consider using NDJSON to split the JSON into manageable chunks of objects and get the best of both worlds.
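
NDJSON stores one JSON document per line, so each line can be parsed as its own small unit of work. A minimal sketch of the idea (`parseNdjson` is a hypothetical name, not a js-coroutines export):

```javascript
// Each NDJSON line is an independent JSON document, so a large payload
// becomes many small JSON.parse calls instead of one big blocking one.
function parseNdjson(text) {
  return text
    .split("\n")
    .filter((line) => line.trim() !== "") // tolerate blank/trailing lines
    .map((line) => JSON.parse(line));
}
```

With a yield (or an idle-time await) between lines, no single parse call holds the main thread for long.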

Yes, even though response.json() looks like it's async, the parsing itself isn't: it still blocks the main thread. Parsing JSON is a lot faster than stringifying it, but if the payload were huge then this would help. Part of the reason for writing js-coroutines was that I needed an async JSON parser.