Klemen Slavič

What Does the Iterator Protocol Have To Do With `for...of`?

Iterables have been around for a while now and are supported on all platforms, and you may have been wondering what they are, how they work and how they can be useful.

This article explains the basics of iterables and the iterator protocol and shows how they can be used.

An Introduction to the Iterator Protocol

You may have used JavaScript's for...of statement to iterate over arrays or Object.entries(), but you may not have known that any object can provide this kind of iteration by implementing the iterator protocol.

An object can be iterated with for...of if it implements the iterator protocol by providing a method called Symbol.iterator. It takes no arguments and returns an iterator.

An iterator is an object with a next() method that returns an iterator result. The result object contains two properties:

  • done: a boolean value that tells the consumer whether the iterator has been depleted
  • value: the value produced by the iterator

Let's first implement a function that will create an iterator that counts up all natural numbers until max:

const naturalNumbers = (max: number) => {
  let n = 1;

  const next = () => {
    if (n > max) return { done: true };
    const value = n;
    n++;
    return { done: false, value };
  };

  return {
    [Symbol.iterator]() {
      return { next };
    },
  };
};

To use this function without the for...of loop, let's use the iterator protocol to demonstrate what it does under the hood:

const sequence = naturalNumbers(10);

// calling the `@@iterator` method on the sequence returns an iterator
const iterator = sequence[Symbol.iterator]();

// we create an infinite loop...
while (true) {
  // ...that will keep reading the next value from the iterator...
  const { done, value } = iterator.next();
  // ...until the iterator tells us it's done
  if (done) break;
  // else, we print out the value and repeat the loop
  console.log(value);
}

This iterator protocol is what enables the for...of loop to create an iterator from the sequence and loop through each value. We can write the above with a very compact syntax:

for (const n of naturalNumbers(10)) {
  console.log(n);
}

While it may look a bit daunting to implement the iterator protocol directly, there is a convenient way to express the same sequence using generator functions.

Creating Iterables using Generator Functions

A generator function can be created using the function* syntax. This enables the use of the yield keyword, which pauses execution of the generator function and passes the value given to yield back to the caller. When the caller requests the next value, the generator function resumes until it hits the next yield expression. When the generator function returns, the iterator reports done: true.
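To see this pause-and-resume behaviour directly, we can call next() by hand on a generator object (a minimal sketch; the twoValues generator here is made up purely for illustration):

```typescript
function* twoValues() {
  yield 1;
  yield 2;
  // falling off the end here makes the next call report done: true
}

const gen = twoValues();
console.log(gen.next()); // { value: 1, done: false }
console.log(gen.next()); // { value: 2, done: false }
console.log(gen.next()); // { value: undefined, done: true }
```

Note that each next() call runs the body only as far as the next yield, so side effects between yields also happen lazily.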

With this in mind, let's rewrite the sequence as a generator function:

function* naturalNumbers(max: number) {
  let n = 1;
  while (n <= max) {
    yield n;
    n++;
  }
}

Calling this generator function returns an iterable value that can be consumed by the for...of loop just like the previous example.

To understand how the body of this function is executed, it's best to imagine that the function starts paused until the iterator's next() method is called. On the first call, n is set to 1 and we enter the loop for the first time. We yield the value 1 and wait until the next call.

On the second next() call, we resume the function and increment n, return to the top of the loop (since 2 <= 10) and yield 2, which pauses the function again.

On the final (11th) call of the next() method, we increment n to 11, the while condition fails, and the function returns, marking the iterator as done.
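The walkthrough above can be verified by driving the iterator manually (repeating the naturalNumbers generator from the previous snippet so this sketch stands on its own):

```typescript
function* naturalNumbers(max: number) {
  let n = 1;
  while (n <= max) {
    yield n;
    n++;
  }
}

// a generator object is itself an iterator, so we can call next() directly
const iter = naturalNumbers(10);
console.log(iter.next()); // { value: 1, done: false }  -- first call
console.log(iter.next()); // { value: 2, done: false }  -- second call
for (let i = 3; i <= 10; i++) iter.next(); // calls 3 through 10
console.log(iter.next()); // { value: undefined, done: true } -- 11th call
```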

Why Bother?

Since iterators only produce values when the next() method is called, we can generate huge sequences without wasting memory creating arrays and storing values. Running new Array(10_000).fill(0) just to produce 10k values is much more wasteful than generating them on demand.
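As a rough illustration of the point, a generator can feed a computation one value at a time, so only the current value is ever held in memory (the upTo generator here is a made-up example, not part of the article's running code):

```typescript
function* upTo(max: number) {
  for (let n = 1; n <= max; n++) yield n;
}

// sums 10k values one at a time; no 10k-element array is ever allocated
let sum = 0;
for (const n of upTo(10_000)) sum += n;
console.log(sum); // 50005000
```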

A generator for all natural numbers is simply:

function* nat() {
  let i = 1;
  while (true) yield i++;
}

NOTE: eventually, this generator will fail to produce distinct numbers above Number.MAX_SAFE_INTEGER, since 64-bit IEEE floating-point numbers do not have infinite precision. Use BigInt instead if you REALLY need to go that far.
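If you do need values beyond that limit, a BigInt-based variant of nat is a small change (a sketch, assuming an ES2020+ target; note the n suffix on the literal):

```typescript
function* natBig() {
  let i = 1n; // BigInt literal; grows without losing precision
  while (true) yield i++;
}

const big = natBig();
console.log(big.next().value); // 1n
console.log(big.next().value); // 2n
```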

If you try to iterate over nat(), the for...of loop will never terminate:

// prepare for a very long wait
for (const n of nat()) {
  console.log(n);
}

We could, of course, write a check inside the body of the for...of loop to terminate early, but we can use generator functions to manipulate other iterables.

Let's create a function that takes an iterable and returns a new iterable that yields at most the given number of items:

function* take<T>(iterable: Iterable<T>, count: number) {
  let i = 0;
  for (const item of iterable) {
    if (i === count) return;
    yield item;
    i++;
  }
}

We can now take the first 10 natural numbers:

for (const n of take(nat(), 10)) {
  console.log(n);
}

Practice Makes Perfect

To get a better feel for generator functions, here are a couple of examples to implement on your own. Post your solutions in the comments and get feedback! 😊

  • create a generator function that will repeat a value a given number of times: repeat("a", 3) => "a","a","a".
  • create a mapping generator function that yields transformed values from a given iterable: map(nat(), x => x * 2) => 2,4,6,8,...
  • create a generator function that takes an iterable and yields every n-th item: every(nat(), 3) => 1,4,7,10,...
  • create a generator function that takes multiple iterables and yields their values packed together into arrays: zip(nat(), nat()) => [1,1],[2,2],[3,3],...

Up Next

A similar concept called Streams also enables consuming large blocks of data by processing small chunks at a time. Stay tuned for the next article delving into this neat little feature.
