It's a bit more subtle, but it's certainly part of the problem. From another angle, abstracting the hardware resources is the way to get super portable code, so there's a tradeoff even if you don't consider developer productivity.
Just to clarify, I'd argue it's the behavior encouraged by high-level languages that's more dangerous than the languages' performance characteristics. As a very small example, some garbage collectors have negligible overhead most of the time, but kill you when you hit a bad edge case. If you didn't know that, you'd never think to look for it; and a garbage-collected language does exactly that: it encourages you to forget the garbage collector exists.
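To make that concrete, here's a small CPython sketch of a GC cost you never asked for: allocating lots of container objects triggers the cycle collector, and you only see the difference when you disable it. This is an illustrative experiment, not a benchmark; the numbers vary a lot by machine and interpreter version.

```python
# Sketch: a garbage collector that's invisible until it isn't.
# CPython's cycle collector tracks container objects (lists, tuples
# holding containers, etc.); building a lot of them triggers
# collections mid-allocation. Timings are machine-dependent.
import gc
import time

def build(n):
    # Each tuple-of-lists is a container the cycle collector tracks.
    return [([i], [i + 1]) for i in range(n)]

gc.collect()
t0 = time.perf_counter()
data = build(500_000)
with_gc = time.perf_counter() - t0
del data

gc.disable()
try:
    gc.collect()
    t0 = time.perf_counter()
    data = build(500_000)
    without_gc = time.perf_counter() - t0
    del data
finally:
    gc.enable()

print(f"with GC:    {with_gc:.3f}s")
print(f"without GC: {without_gc:.3f}s")
```

The point isn't the exact numbers; it's that the cost exists at all, in code that never mentions the GC.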
That's the thing with abstractions: they have their costs, but at the same time they reduce the effort of resolving lower-level problems which enables us to address higher-level problems.
Computers as a concept work like this:
we created devices to do calculations mechanically, so we don't have to do them by hand
we reduced the mechanics to electron movements, so we need to move less physical stuff around
we created machine code so that we don't need to move electrons around manually
we created low-level languages so that we don't need to hand-write machine code all the time
we created high-level languages so we don't need to continuously think about memory allocation and memory management, and can instead direct our capacity for cognitive load towards logic and architecture
we created libraries, so that we can subtract lower-level logic from our cognitive load and focus more on higher-level architecture
we created frameworks, so that we can subtract boilerplate higher-level logic from our cognitive load and focus more on business-specific architecture
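The ladder above can be shown in miniature: the same computation at three levels of abstraction, each one subtracting a bit of mechanism from your cognitive load. A toy illustration, nothing more:

```python
# The same sum at three levels of abstraction.
nums = [3, 1, 4, 1, 5]

# Lower level: manual index bookkeeping.
total = 0
i = 0
while i < len(nums):
    total += nums[i]
    i += 1

# Higher level: the language manages iteration for you.
total2 = 0
for n in nums:
    total2 += n

# Highest level: a library call subtracts the loop entirely.
total3 = sum(nums)

assert total == total2 == total3 == 14
```

Each step gives up explicit control of the machinery in exchange for thinking less about it, which is the whole tradeoff the thread is arguing about.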
But by that logic, we should all technically be programming in assembly. Find your target machine's instruction set and let's program in that.
When C came out, it was considered a high-level language. You would use C, but never for anything where speed mattered. In time that turned around. The same happened with C++ when it came out.
There are PLENTY of systems you use every day that have great performance but are built in high-level languages such as Python and C#.
In my mind, the idea that a high-level language can't produce optimized code is outdated and should be thrown away.
This is because we use high-level languages where the CPU and RAM are totally abstracted away.