let a = '2';
let b = 1;
console.log(a > b); // this prints true
In the above example, we’re comparing two variables of different data types: a string '2' and a number 1. JS is still able to determine that 2 is greater than 1 and returns true. This is because when the relational operators compare values of different types, JS converts both values to numbers and then compares them. Here, the string '2' is first converted to the number 2 and then compared with the number 1, so the statement returns true.
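The implicit conversion step can be made explicit with Number(). A minimal sketch to illustrate that a > b and Number(a) > b produce the same result:

```javascript
let a = '2';
let b = 1;

// The relational operators convert both operands to numbers first.
console.log(Number(a));     // 2
console.log(Number(a) > b); // true — the comparison JS actually performs
console.log(a > b);         // true — same result, with the conversion done implicitly
```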
console.log(true == 1); // this prints true
console.log(false == 0); // this prints true
Here, true equates to 1 and false equates to 0 in their respective number conversions. A good rule of thumb is that the boolean true always converts to the number 1 and false always converts to the number 0. (Note that this applies to the booleans themselves, not to all truthy and falsy values: the truthy string '2' converts to the number 2, and the falsy value NaN converts to NaN.)
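A quick sketch of these conversions at work, contrasting the booleans with other truthy/falsy values:

```javascript
// The booleans convert cleanly to 1 and 0...
console.log(Number(true));  // 1
console.log(Number(false)); // 0

// ...but other truthy/falsy values have their own numeric conversions.
console.log(Number('2')); // 2 (truthy, yet not 1)
console.log(Number(NaN)); // NaN (falsy, yet not 0)
```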
Now let’s try to see an interesting consequence of the above two examples. Consider the following code
let a = 0;
let b = "0";
console.log(Boolean(a) == Boolean(b)); // this prints false
console.log(a == b); // but this prints true
Boolean(a) = Boolean(0), which evaluates to false, since 0 is a falsy value. Boolean(b) = Boolean("0"), which evaluates to true, since any non-empty string is a truthy value.
Hence, Boolean(a) == Boolean(b) returns false.
However, a == b returns true, since the "0" value of b is converted to the number 0 and then compared with the 0 value of a.
There’s an issue with using == when comparing certain values.
console.log(false == 0); // this prints true
console.log(false == ''); // this prints true
For example, the == operator cannot differentiate between false and 0, since both of them are falsy values and equate to 0 in their number conversions. The same holds for false and the empty string as well.
The above conundrum is solved by using the triple equals (===) operator, also known as the strict equality operator. The difference between triple equals and double equals is that the former does not perform any implicit type conversion before comparing. In other words,
console.log(false == 0); // this prints true
console.log(false === 0); // this prints false
In the above example, the second statement compares false directly with 0, without converting either value, so the result is false.
Any comparison between values of different data types returns false when using the === operator, since no conversion is attempted. Correspondingly, !== returns true for operands of different types. One pair worth calling out is null and undefined:
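A short sketch of strict equality across types, including its negation:

```javascript
// === never coerces; operands of different types are simply unequal.
console.log('1' === 1);  // false — string vs. number
console.log(true === 1); // false, even though true == 1

// !== is the exact negation of ===.
console.log('1' !== 1);  // true
```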
console.log(null === undefined); // this prints false
console.log(null == undefined); // this prints true
However, this equality applies only to the == operator, which treats null and undefined as equal to each other (and to nothing else). All the other comparison operators return false when comparing null with undefined.
console.log(null > undefined); // this prints false
console.log(null < undefined); // this prints false
console.log(null >= undefined); // this prints false
console.log(null <= undefined); // this prints false
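The reason all four relational comparisons fail becomes clear once we look at the number conversions involved (a minimal sketch):

```javascript
// Relational operators convert both operands to numbers:
console.log(Number(null));      // 0
console.log(Number(undefined)); // NaN

// Every comparison involving NaN is false, in both directions:
console.log(0 >= NaN); // false
console.log(0 <= NaN); // false
```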
10 > 9; // this returns true
"10" > "9"; // this returns false
"10".charCodeAt(0); // this returns 49 "9".charCodeAt(0); // this returns 57
When both operands are strings, JS compares them character by character by their character codes instead of converting them to numbers. The character code of '1' (the first character of "10") is 49, which is less than the character code of '9', which is 57. Hence "10" is considered the smaller value.
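The fix, when you actually want numeric ordering, is to convert the strings to numbers yourself before comparing. A small sketch:

```javascript
// String comparison proceeds code unit by code unit, left to right.
console.log("10" > "9"); // false — '1' (49) < '9' (57)
console.log("10" < "9"); // true, for the same reason

// To compare numeric strings by value, convert them explicitly first:
console.log(Number("10") > Number("9")); // true
```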
P.S. I shall be updating this article with other quirks, as I encounter them. Till then, happy coding!
This post was originally published here on Medium.