Ardi

Don't learn C to learn how computers work

There is a murmur in the tech space lately that you should just learn C in order to learn how the computer really works. Don't do that. Read a book.

C is supposed to be low level: you code for the hardware, you tell C what to do, it translates neatly into machine code you can understand, and you live happily in a rainbow world full of computer candy :)

WRONG! You don't code for the hardware - you code for the abstract machine defined by the C standard. What your program does is most definitely not what the hardware does!

Ardi, are you insane? What are you talking about?

What do you think happens when you have a signed char - an 8-bit signed integer - with the value 127 and you add one to it? If you paid attention in class you might know that with two's complement it should wrap around to -128. And you would be right! That is, if C actually cared about the hardware. It turns out that, for historical reasons, signed overflow is undefined behaviour (UB), which means the compiler could wrap it around, "optimize" your code away, or even make demons come out of your nose.
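
To see what "undefined" means in practice, here is a minimal sketch using `int` (signed integer overflow there is unambiguously UB). With optimizations on, GCC and Clang commonly assume the overflow can never happen and fold the check below to true - which is not what the underlying two's-complement hardware would tell you:

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    int x = INT_MAX;

    /* Signed integer overflow is undefined behaviour, so the compiler is
       allowed to assume x + 1 > x always holds and delete the else branch
       entirely - even though the actual CPU add instruction would wrap. */
    if (x + 1 > x) {
        printf("no overflow\n");
    } else {
        printf("wrapped around\n");
    }
    return 0;
}
```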

Or if you have an array and you mistakenly access it out of bounds, that's also UB. In fact, there are A LOT of behaviours considered undefined in C that would be perfectly ordinary things to do in assembly (probably by mistake, though).
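
A minimal sketch of that second case - on most hardware the out-of-bounds load would simply fetch whatever bytes happen to sit after the array, but the C abstract machine makes no such promise:

```c
#include <stdio.h>

int main(void) {
    int a[4] = {1, 2, 3, 4};

    /* a[4] is one past the last element: reading it is undefined
       behaviour. The compiler may hand you garbage, crash the program,
       or silently break the surrounding code - it owes you nothing. */
    printf("%d\n", a[4]);
    return 0;
}
```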

There are a lot of web developers who don't understand how memory works, and that should be addressed, but telling them to just learn C is not enough. A basic understanding of how the computer works can definitely be achieved by some basic reading and, yes, maybe by writing some C - implement memcpy and a toy malloc (see the sketch below) - but C shouldn't be the goal, only the medium.
Read a book that teaches you these concepts, don't just learn C.
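
For instance, a toy memcpy might look like the sketch below. The real memcpy is far more optimized (word-sized copies, alignment handling, SIMD), but even this naive version forces you to think in raw bytes and pointers, which is the point of the exercise:

```c
#include <stddef.h>

/* Toy memcpy: copy n bytes from src to dst, one byte at a time.
   Assumes the regions do not overlap, just like the real memcpy. */
void *toy_memcpy(void *dst, const void *src, size_t n) {
    unsigned char *d = dst;
    const unsigned char *s = src;
    for (size_t i = 0; i < n; i++) {
        d[i] = s[i];
    }
    return dst;
}
```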

The same can be said for operating systems, but honestly, even Node.js lets you work with raw files.
