Discussion on: Book Club: Eloquent JavaScript - Chapter 2

Alex Kharouk

Wow, @peerreynders , thanks again for the lovely, in-depth reply.

Just to clarify some things: does that mean that in this piece of code, if (i % 5 === 0) word += 'Buzz';, the expression is i % 5 === 0, as that returns a value, whilst the whole piece of code is a statement? Is word += 'Buzz' a statement as well, with a side effect?

I am having difficulty correlating "nouns" with the capability of producing and transforming values.

I initially looked at expressions as values; 5 is 5, but with your comment on how expressions result from an evaluation, I see that expressions are far more than just "values". As you mentioned, they can also transform values or manipulate the flow. I suppose I did regard them as constant expressions.

I have never heard of PLOP or the value of values talk! Interesting. Would you then say the majority of code in JavaScript is place-oriented?

If new information replaces the old, you are doing place-oriented programming.

It seems like we are constantly writing statements.

Adopting an expression-based coding style can help reduce the use of mutation - as long as the performance cost and memory consumption remain within acceptable limits.

If we look at how you disassembled the FizzBuzz example to be more expression-based, it became less and less clear what exactly the code was supposed to be doing. It may have been more expression-based, but it also lost its simplicity (as you mentioned), so I would add that as a limit alongside performance and memory consumption.

peerreynders • Edited

Is word += 'Buzz' a statement as well, with a side effect?

Assignments are expressions; the only thing that is a statement here is the if statement. It's unfortunate, but in JavaScript expression evaluation can have side effects.
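
To make the distinction concrete, here is a small made-up snippet (the extra '!' append and the commented-out line are just for illustration):

let i = 10;
let word = '';

// the whole line below is an if statement; both i % 5 === 0 and word += 'Buzz'
// are expressions inside it
if (i % 5 === 0) word += 'Buzz';

// an assignment expression evaluates to the assigned value...
console.log((word += '!')); // logs "Buzz!"
// ...and mutating word is its side effect
console.log(word);          // logs "Buzz!" again

// a statement doesn't produce a value, so it can't appear where a value is expected:
// const x = if (true) 'x';  // SyntaxError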

Would you then say the majority of code in JavaScript is place-oriented?

Make no mistake, JavaScript is an imperative language. In fact it's far more statement-based than, for example, Rust. While Rust is also an imperative language, it embraced expressions to such an extent that it supports a lot of practices typical of functional languages while also supporting an imperative style of coding. But because Brendan Eich was a Scheme fan when he designed JavaScript, there is enough wiggle room to pursue a value-oriented style in JavaScript (JavaScript is a LISP). And from what I can judge, that is the style Marijn Haverbeke is leaning towards (though not with any dogma).
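
As a small, made-up illustration of that difference (not from the chapter): in JavaScript if is a statement, so the result has to be assigned into a place, whereas the conditional (ternary) operator is an expression that simply produces the value. In Rust, if itself is already an expression.

const n = 9;

// statement-based: if is a statement, so the result is written into a place
let label;
if (n % 3 === 0) {
  label = 'Fizz';
} else {
  label = String(n);
}

// expression-based: the conditional operator is an expression,
// so the whole right-hand side evaluates to a value
const label2 = n % 3 === 0 ? 'Fizz' : String(n);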

It may have been more expression-based, but it also loses its simplicity (as you mentioned)

I would argue that it actually gains simplicity but loses "easiness" ("simple" scales, "easy" does not).

Before I can dive into that particular discussion it may be useful to expose my thinking behind the various changes:


Again starting with:

function fizzBuzz(count) {
  for (let i = 1; i <= count; i++) {
    let word = '';
    if (i % 3 === 0) word += 'Fizz';
    if (i % 5 === 0) word += 'Buzz';
    console.log(word || i);
  }
}

For my taste this is a bit of a lumper's solution - it feels more like a script than a function with focus - and given the context that may be OK.

However the core rules

... for multiples of three print “Fizz” instead of the number and for the multiples of five print “Buzz”. For numbers which are multiples of both three and five print “FizzBuzz”.

don't seem to get the boundary they deserve, so:

function fizzBuzzer(count) {
  for (let i = 1; i <= count; i += 1) console.log(fizzBuzz(i));
}

function fizzBuzz(n) {
  let word = '';
  if (n % 3 === 0) word += 'Fizz';
  if (n % 5 === 0) word += 'Buzz';
  return word || n.toString();
}

Now "the rules" are extracted into fizzBuzz while fizzBuzzer orchestrates generating the necessary input values and outputs the result values. Note the n.toString() to ensure that fizzBuzz returns a string rather than a string | number (sum types).

Now if you're OK with local reassignment/mutation you can stop here.

Eliminating reassignment/mutation takes a bit more work:

function fizzBuzzer(count) {
  for (let i = 1; i <= count; i += 1) console.log(fizzBuzz(i));
}

function fizzBuzz(n) {
  return buzz(fizz(n)) || n.toString();
}

function fizz(n) {
  return n % 3 === 0 ? ['Fizz', n] : ['', n];
}

function buzz(value) {
  const [word, n] = value;
  return n % 5 === 0 ? word + 'Buzz' : word;
}

Functions that accept a single value compose better, given that functions only return one value. To be able to simply write buzz(fizz(n)), fizz has to return the string and n simultaneously. Traditionally in JavaScript an object { word, n } would be used for that purpose, but for just two values that's a bit verbose, so a tuple is used to pass the necessary data to buzz.

Aside:
JavaScript only has arrays; however, they can be used in different ways. When an array is used "as a list" it can hold zero to many elements, but the elements are often expected to all be of the same type. When an array is used as a tuple it is expected to have a certain, exact length but can hold values of varying types, though each position holds the expectation of a specific type. Here fizz returns a [string, number] tuple (pair, couple).
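
A small illustration of the two usages (the names are made up for the example):

// used "as a list": any length, all elements the same kind of thing
const words = ['Fizz', 'Buzz', 'FizzBuzz'];

// used "as a tuple": fixed length, each position has its own meaning and type
const pair = ['Fizz', 3];        // [string, number]
const [word, n] = pair;          // destructuring reads the positions back
console.log(`${n} -> ${word}`);  // "3 -> Fizz"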

There is a pipeline proposal under consideration. With it, buzz(fizz(n)) could be rewritten as n |> fizz |> buzz - pipelining values through a chain of functions makes the order of operations more obvious.
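
The proposal isn't part of the language yet; in the meantime a tiny (hypothetical) pipe helper gives a similar left-to-right reading with the fizz/buzz versions above:

const pipe = (...fns) => (x) => fns.reduce((value, fn) => fn(value), x);

const fizzThenBuzz = pipe(fizz, buzz); // reads in the order the steps happen
console.log(fizzThenBuzz(15));         // "FizzBuzz"
console.log(buzz(fizz(15)));           // same result, read inside-out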

At this point there is a certain similarity between fizz and buzz - they can be easily homogenized:

function fizzBuzzer(count) {
  for (let i = 1; i <= count; i += 1) console.log(fizzBuzz(i));
}

function fizzBuzz(n) {
  return buzz(fizz(['', n]))[0] || n.toString();
}

function fizz(value) {
  const [word, n] = value;
  return n % 3 === 0 ? [word + 'Fizz', n] : value;
}

function buzz(value) {
  const [word, n] = value;
  return n % 5 === 0 ? [word + 'Buzz', n] : value;
}

This makes buzz(fizz(['', n]))[0] a bit more complicated. ['', n] has to be supplied as an initial value and at the end we extract the word with an array index of 0. At this point somebody may yell "repetition":

function fizzBuzzer(count) {
  for (let i = 1; i <= count; i += 1) console.log(fizzBuzz(i));
}

function fizzBuzz(n) {
  return buzz(fizz(['', n]))[0] || n.toString();
}

function makeTransform(divisor, fragment) {
  return function (value) {
    const [word, n] = value;
    return n % divisor === 0 ? [word + fragment, n] : value;
  };
}

const fizz = makeTransform(3, 'Fizz');
const buzz = makeTransform(5, 'Buzz');

Pros: nice highlighting and separation of commonality and variability.

Commonality:

function makeTransform(divisor, fragment) {
  return function (value) {
    const [word, n] = value;
    return n % divisor === 0 ? [word + fragment, n] : value;
  };
}

Variability:

const fizz = makeTransform(3, 'Fizz');
const buzz = makeTransform(5, 'Buzz');

Cons: makeTransform is more difficult to understand. And more importantly, I think this is a case of duplication being far cheaper than the (wrong) abstraction, given that we know we'll only ever need fizz and buzz.

Now one could be perfectly OK with the reassignment/mutation for the sake of a for…loop but staying with the "no destruction of values" theme:

function fizzBuzz(n) {
  return buzz(fizz(['', n]))[0] || n.toString();
}

function fizz(value) {
  const [word, n] = value;
  return n % 3 === 0 ? [word + 'Fizz', n] : value;
}

function buzz(value) {
  const [word, n] = value;
  return n % 5 === 0 ? [word + 'Buzz', n] : value;
}

function displayValues(values) {
  for (const value of values) console.log(value);
}

displayValues(Array.from({ length: 100 }, (_v, i) => fizzBuzz(i + 1)));

Sorry, as far as I'm concerned forEach is an imposter HOF (higher-order function), so I'll prefer for…of.
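
One way to read that claim (a sketch reusing the fizzBuzz above; shouted is just an illustrative name): a "real" HOF like map returns a value you can keep working with, while forEach always returns undefined and exists purely for its side effects.

const values = Array.from({ length: 5 }, (_v, i) => fizzBuzz(i + 1));

// map returns a new value, so it composes with further transformations
const shouted = values.map((v) => v.toUpperCase());
console.log(shouted);                  // ["1", "2", "FIZZ", "4", "BUZZ"]

// forEach takes a function but always returns undefined - side effects only
console.log(values.forEach((v) => console.log(v))); // logs each value, then undefined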


Now back to our originally scheduled programming…

Juxtaposing place-oriented (PLOP):

function fizzBuzz(count) {
  for (let i = 1; i <= count; i++) {
    let word = '';
    if (i % 3 === 0) word += 'Fizz';
    if (i % 5 === 0) word += 'Buzz';
    console.log(word || i);
  }
}

with value-oriented (VOP)

function fizz(value) {
  const [word, n] = value;
  return n % 3 === 0 ? [word + 'Fizz', n] : value;
}

function buzz(value) {
  const [word, n] = value;
  return n % 5 === 0 ? [word + 'Buzz', n] : value;
}

function fizzBuzz(n) {
  return buzz(fizz(['', n]))[0] || n.toString();
}

function displayValues(values) {
  for (const value of values) console.log(value);
}

displayValues(Array.from({ length: 100 }, (_v, i) => fizzBuzz(i + 1)));
  • For one, we have to acknowledge that if we learned and primarily practice imperative programming, it's going to seem "easier" to us than approaches from other paradigms (for example, imperative programming doesn't really prepare you for SQL).
  • FizzBuzz is a small enough problem that it Fits in Your Head as is. Real problems tend to be much larger, which is why we distribute functionality over multiple functions or objects. However, even when partitioned, reasoning about code that freely interacts with other parts of the system (or "the world") via mutation of shared data and side effects can be difficult, primarily because the "state" of mutable things varies with time. For example, in the PLOP code word isn't just a simple value but potentially keeps changing (though given that it isn't exposed outside of the loop, that isn't a big issue in this case).
  • Looking at the VOP code: it may initially take some time to get your eye in, but fizz is extremely simple: it either returns the original [word, n] value or a new [word + 'Fizz', n] value, depending on the value of n. Now fizz isn't as descriptive as on3AppendFizzToWord, but in this context it's likely good enough that we can forget about the actual code and just know what fizz is about.
  • Similarly for buzz as it fits exactly the same pattern.
  • In fizzBuzz the compactness of buzz(fizz(['', n]))[0] could be an issue - I think (['', n] |> fizz |> buzz)[0] would be easier to read, i.e. transform ['', n] through fizz and buzz and use the first element of the resulting value. But again, fizzBuzz only uses a single line of extremely simple functions and operators. Once parsed you can forget about the code and know what fizzBuzz means.
  • fizz, buzz and fizzBuzz are pure functions and therefore referentially transparent. Referential transparency is the property of being able to replace a function call with its return value for a given input; functions that depend on other factors are referentially opaque (see the sketch after this list), so:
    • referentially transparent -> simple
    • referentially opaque -> not so simple
  • The idea is to stick to building "simple" functions as building blocks and create other (simple) building blocks by composing them - yielding functions capable of complex value transformations.
  • Aside: Scott Wlaschin views functions as Interfaces (i.e. simple interfaces). With that view functions should be easier to compose because object interfaces can be a lot more complicated.
  • displayValues has side effects, but it doesn't have to know anything about fizzBuzz. These types of functions are necessary, but we should separate them for easy identification. Ideally a system should be organized as a functional core and an imperative shell (Hexagonal architecture). The functional core should be easy to test as it just transforms values.
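
A minimal sketch of the referential transparency distinction, reusing the fizzBuzz above (fizzBuzzNow is a made-up example of an opaque function):

// referentially transparent: the call can always be replaced by its result
console.log(fizzBuzz(15) === 'FizzBuzz'); // true, every single time

// referentially opaque (contrived example): the result depends on when it runs,
// so no fixed value can stand in for the call
function fizzBuzzNow() {
  return fizzBuzz(new Date().getSeconds() + 1);
}
console.log(fizzBuzzNow()); // varies from call to call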

The idea is that VOP scales better than PLOP for larger problems because it can stick to simple building blocks while achieving complex value transformations through composition.


If you're interested, Refactoring JavaScript goes through typical tactics used to improve JavaScript code. It discusses OO practices as well as functional practices. The functional refactoring largely revolves around avoiding destructive actions, mutation, and reassignment.