Squeezing infinitely many real numbers into a finite number of bits requires an approximate representation. Although there are infinitely many integers, in most programs the result of integer computations can be stored in 32 bits. In contrast, given any fixed number of bits, most calculations with real numbers will produce quantities that cannot be exactly represented using that many bits.
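You can see that approximation directly in JavaScript; here's a small sketch (the printed digits assume the usual IEEE 754 double-precision `Number` type):

```javascript
// 0.1 has no exact binary representation, so the stored value
// is the nearest 64-bit float, which is slightly more than 0.1.
console.log((0.1).toFixed(20)); // "0.10000000000000000555"
```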
Nice write-up! I wouldn't call these behaviors weird, though; this is by far the least weird behavior of JavaScript.
Thanks, you're right, this is not that weird. I wanted to make this a series. I don't know what I'll post next, but I'm definitely considering "why 0.1 + 0.2 = 0.30000000000000004", which is a lot weirder than this.
It is weird. I'd love to know more about this!
It's not that weird actually. It's just how computers handle floating-point numbers.
Excerpt from this post: docs.oracle.com/cd/E19957-01/806-3...
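A quick sketch of what that looks like in practice, assuming standard IEEE 754 doubles (which is what JavaScript's `Number` uses):

```javascript
// Neither 0.1 nor 0.2 can be represented exactly in binary,
// so their sum is slightly off from 0.3.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// The usual workaround: compare within a small tolerance
// instead of testing exact equality.
const nearlyEqual = (a, b) => Math.abs(a - b) < Number.EPSILON;
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```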
Really interesting read, I had not read much about this whole topic!
Yeah, it feels weird when you first see it, but after you've learned how numbers work in JavaScript, everything makes sense.
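For anyone curious about the "how numbers work in JavaScript" part, a minimal illustration (assuming the standard `Number` type, which is always a 64-bit float, even for integers):

```javascript
// JavaScript has a single Number type: IEEE 754 double precision.
console.log(Number.isInteger(42));    // true, but still stored as a float
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (2^53 - 1)

// Beyond 2^53, integers can no longer be distinguished:
console.log(9007199254740992 === 9007199254740993); // true
```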
youtube.com/watch?v=MqHDDtVYJRI
This video really helped me grasp how numbers work in JavaScript.
This looks really interesting, I'll give it a watch, thanks.