I'm wondering what you mean by "low level" as opposed to "high level", since I hear it used a lot.
Both terms are relative, but basically: the closer you get to the computer hardware, the lower the level.
So when talking about languages, machine code (binary) is the lowest-level language; assembly is slightly “higher” but still extremely low; C is higher still, but relatively “low” compared to Ruby or Python, which take care of things like garbage collection and memory allocation for you.
I used it to refer to the actual hardware (RAM, CPU, etc.), which is about as “low” as you can get.
Also, like I mentioned above, it's usually used as a relative term, which sometimes makes it hard to pin down. Someone who works in Python might call C a low-level language, while someone who works in assembly might call C a high-level language. What they really mean is that it's a “higher” or “lower” level language than what they're used to writing.
Thank you for the detailed and comprehensive explanation! Term demystified :D
Maybe it's a bit of intergenerational tension but I for some reason think of C as "my dad's time" but of course people still use it. It's probably because my dad really hoped I would learn C when I was younger.