Web Dev Ken

Why JavaScript Numbers Are Not Precise

You might have already heard that you should not do mathematical calculations with JavaScript, at least not if you need precise results.

But JavaScript has whole numbers, decimals and floating point numbers. Why can't I calculate my mathematical formulas with it?

At first glance, it might look like JavaScript displays large numbers correctly. Mostly it does, but sometimes it doesn't. Why is that? There is actually a pretty concrete reason from the perspective of a computer, but we will get to that in a moment. Let's first look at some examples:

const divide = 15 / 3;

console.log(divide);
// 5

In the example above, 15 is divided by 3, which results in 5 when we log it to the console. Everything fine so far. Let's do a more complex calculation.

const divide = 15 / 0.1;
console.log(divide);
// 150

In the calculation above, we divide 15 by 0.1, which results in 150. Still no issues, right? Let's make the divisor smaller and observe what happens.

const divide = 15 / 0.01;
console.log(divide);
// 1_500

All OK.

const divide = 15 / 0.001;
console.log(divide);
// 15_000

Still OK.

const divide = 15 / 0.0001;
console.log(divide);
// 150_000

Hah! No issues, give me more!

Note: I am using underscore (numeric separator) notation like "1_000" to better show how the numbers grow. console.log does not actually display numbers that way. Writing numbers that way, however, is absolutely valid in JavaScript.
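To quickly illustrate that note: the underscores are purely for readability and do not change the value at all.

const million = 1_000_000;
console.log(million);
// 1000000

console.log(1_000_000 === 1000000);
// true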

Until now, everything seems to work as expected. The next example will change your expectations drastically:

const divide = 15 / 0.00001;
console.log(divide);
// 1_499_999.9999999998

What happened? We expected 15 divided by 0.00001 to equal 1_500_000 but actually received 1_499_999.9999999998. What's going on? Let's explain that.

The Explanation

Not only JavaScript, but other programming languages as well sometimes have issues representing floating point numbers accurately. Consider the number PI, which is something like 3.141592653589793... and so on. Pretty much every programming language has problems storing the full number PI, but why?
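You can see that cutoff directly in JavaScript, where the built-in constant simply stops after a fixed number of digits:

console.log(Math.PI);
// 3.141592653589793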

The reason is pretty simple: memory space. Holding a large floating point number with full accuracy requires a lot of memory. Most programming languages have therefore agreed on a strategy to solve this problem. They either round the value at the last decimal place once it reaches the end of the available space, so it fits again (also called approximation), or they offer a special type like BigDecimal in Java or BigInteger in C# that reserves more memory than a normal number type like int or double.
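In JavaScript you often see the approximation approach applied by hand: you round the result back to the precision you actually need. This is only a sketch of that idea for our example from above, not a general-purpose fix:

const divide = 15 / 0.00001;
console.log(divide);
// 1_499_999.9999999998

// Round to the nearest whole number to approximate the intended result
console.log(Math.round(divide));
// 1_500_000

// Or keep a fixed number of decimal places (note: toFixed returns a string)
console.log(divide.toFixed(2));
// "1500000.00"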

When it comes to JavaScript, it runs in an environment with certain limitations and design choices baked into it. One design choice that leads to even less accurate numbers is that whole numbers, negative numbers and floating point numbers all have to fit into a 64-bit memory space (the IEEE 754 double-precision format). You can imagine that shortcuts had to be made to actually fit all these numbers into a limited space of 64 bits.
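You can observe that limit with whole numbers, too. Beyond Number.MAX_SAFE_INTEGER, the 64 bits are simply not enough to tell neighboring integers apart:

console.log(Number.MAX_SAFE_INTEGER);
// 9_007_199_254_740_991

console.log(9_007_199_254_740_993);
// 9_007_199_254_740_992  <- off by one, the exact value no longer fits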

Source: Values, Types, and Operators

How should I display large numbers in my application?

The answer is easy: you don't, if possible, because you can run into inaccuracy very quickly. If your application has a backend written in a language that can handle larger numbers, you could use JavaScript only to display those numbers as strings. Or you could use NPM packages like decimal.js or bignumber.js, which handle large numbers a little better. In the end you have to acknowledge that JavaScript has these limitations.
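As a rough sketch of the package route (assuming decimal.js is installed via npm; check its docs for the exact API), the division from above could look like this:

import Decimal from "decimal.js";

// Pass the operands as strings so they are never parsed as
// imprecise JavaScript numbers in the first place
const divide = new Decimal("15").div("0.00001");

console.log(divide.toString());
// "1500000"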
