First, I created an async generator that reads the input file stream chunk by chunk and yields each number line by line:
```javascript
async function* streamToFrequencies(stream) {
  let previous = "";
  for await (const chunk of stream) {
    previous += chunk;
    let eolIndex;
    while ((eolIndex = previous.indexOf("\n")) >= 0) {
      // exclude the EOL
      const number = previous.slice(0, eolIndex);
      yield parseInt(number);
      previous = previous.slice(eolIndex + 1);
    }
  }
  if (previous.length > 0) {
    yield parseInt(previous);
  }
}
```
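Part 1 then reduces to summing every yielded frequency change. A minimal sketch of consuming the generator (the generator from above is repeated here so the snippet runs standalone; the `partOne` name and the stream argument are illustrative assumptions):

```javascript
// The generator from the post, repeated so this sketch is self-contained.
async function* streamToFrequencies(stream) {
  let previous = "";
  for await (const chunk of stream) {
    previous += chunk;
    let eolIndex;
    while ((eolIndex = previous.indexOf("\n")) >= 0) {
      yield parseInt(previous.slice(0, eolIndex), 10);
      previous = previous.slice(eolIndex + 1);
    }
  }
  if (previous.length > 0) {
    yield parseInt(previous, 10);
  }
}

// Part 1: the final frequency is just the sum of all changes.
const partOne = async stream => {
  let total = 0;
  for await (const frequency of streamToFrequencies(stream)) {
    total += frequency;
  }
  return total;
};
```

The `stream` argument would typically be `fs.createReadStream("01-input.txt")`.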
This approach made part 2 ugly because I had to find a way to re-open the file stream and loop back through the inputs. Using a Set instead of an array really helped with performance.

Putting it all together:
```javascript
const calibrate = async stream => {
  let currentFrequency = 0;
  const frequenciesFound = new Set([0]);
  while (true) {
    // clone stream and put in cold storage
    // in case we need to re-read inputs.
    let frozenStream = clone(stream);
    for await (const frequency of streamToFrequencies(stream)) {
      currentFrequency += frequency;
      if (frequenciesFound.has(currentFrequency)) {
        return currentFrequency;
      }
      frequenciesFound.add(currentFrequency);
    }
    stream = frozenStream;
  }
};
```
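The `clone` helper is presumably part of the full code; it isn't a Node built-in, and flowing readable streams can't generally be duplicated. An alternative sketch (not the author's approach) sidesteps cloning by accepting a factory that re-creates the stream for each pass:

```javascript
// Sketch: instead of cloning the stream, accept a factory that
// re-opens the input for every pass over the frequency list.
// streamToFrequencies is repeated so this snippet runs standalone.
async function* streamToFrequencies(stream) {
  let previous = "";
  for await (const chunk of stream) {
    previous += chunk;
    let eolIndex;
    while ((eolIndex = previous.indexOf("\n")) >= 0) {
      yield parseInt(previous.slice(0, eolIndex), 10);
      previous = previous.slice(eolIndex + 1);
    }
  }
  if (previous.length > 0) {
    yield parseInt(previous, 10);
  }
}

const calibrate = async makeStream => {
  let currentFrequency = 0;
  const frequenciesFound = new Set([0]);
  while (true) {
    // makeStream() returns a fresh stream each pass,
    // e.g. () => fs.createReadStream("01-input.txt")
    for await (const frequency of streamToFrequencies(makeStream())) {
      currentFrequency += frequency;
      if (frequenciesFound.has(currentFrequency)) {
        return currentFrequency;
      }
      frequenciesFound.add(currentFrequency);
    }
  }
};
```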
I also did mine in JS, but I decided to use the readline interface to read each line individually and use less memory by not loading the entire file into memory at once.
I haven't tried using for await (... of ...) with the readline interface. Maybe I'll try that next. If anyone would like to try it, please post it here.
Here are my solutions:
My solution in JavaScript / Node 11, using the readline interface (via a small readLines.js helper):

01a.js
```javascript
const { readFile } = require('./readLines');

(async () => {
  const lines = await readFile('01-input.txt');
  const frequency = lines.reduce((frequency, line) => frequency + Number(line), 0);
  console.log(`The final frequency is ${frequency}`);
})();
```
01b.js
```javascript
const { readFile } = require('./readLines');

(async () => {
  const lines = await readFile('01-input.txt');
  // include 0: the starting frequency counts as already seen
  const frequencySet = new Set([0]);
  let frequency = 0;
  let didAFrequencyReachTwice = false;
  while (!didAFrequencyReachTwice) {
    for (const line of lines) {
      frequency += Number(line);
      if (frequencySet.has(frequency)) {
        didAFrequencyReachTwice = true;
        break;
      } else {
        frequencySet.add(frequency);
      }
    }
  }
  console.log(`The first frequency reached twice is ${frequency}`);
})();
```
Readline doesn't work with async iterators and for await yet, but support just landed in 11.x staging. Once it is released, my streamToFrequencies generator won't be needed.
Also, createReadStream only reads the file in chunks (64 KiB by default, or whatever you set the highWaterMark to); it does not read the entire file into memory. readFile would, however.
Full code: github.com/MattMorgis/Advent-Of-Co...
To read more about async iterators and generators and the for await syntax, check out 2ality.com/2018/04/async-iter-node...

Thanks a bunch @mattmorgis !