DEV Community

Discussion on: The Dark Side Of The Magic

Jason C. McDonald • Edited

Four counter-counter-points:

  1. Bad code is unmaintainable and bug-prone, which costs exponentially more in hours, money, downtime, and lost productivity. There are entire studies documenting just the quantifiable cost of this. Read "Dreaming in Code" by Scott Rosenberg for a thorough overview. I can't do it justice in a comment.

  2. There's a profound difference between deferring and avoiding. You don't need to learn everything now, but you should learn these things eventually. I explicitly addressed avoidance.

  3. Abstractions often break, conceal inefficiencies, and/or unexpectedly mismatch logical intent. When that happens, only those who understand said abstractions can fix their code. Every programmer will slam into this painful reality sooner or later (or else live in denial of it and ship unusable code). Again, see the difference between deferring and avoiding.

  4. There are no shortcuts to becoming a good programmer. You have to put in the time and effort. Those who skip all the hard work eventually find their lack of drive has closed doors. Someone who has been ostensibly coding for eight years, and yet lacks much of the general knowledge of their peers, will be regarded as a burden by any development team that has to carry them. That is no way to build a rewarding career.

If you're two years into coding, I don't expect mastery of any of the above by any means! Simply moving towards it is all that can be expected. But if you're six years into coding, and in the same place you were when you started, there's a profound problem. The entire point of the self-examination I called for is to prevent that stagnation.

There are different opinions on all subjects, but these points are beyond opinion. I'm simply restating truths that have been proven time and again. For their efficacy, I appeal to the entire history of software development.

P.S. For decades, many have appeared claiming that "what's right in programming" is somehow changing. They've invariably been proven wrong every time. Abstraction doesn't change the underlying logic. The car doesn't replace the engine.

stereobooster

> P.S. For decades, many have appeared claiming that "what's right in programming" is somehow changing.

I mean, Turing was right, no arguments about that, and the deep nature of computation doesn't change. But "best" practices, languages, and paradigms change a lot. At some point people were saying SQL is dead because it doesn't scale and NoSQL is our savior, and then Google made Spanner (which is possible only due to huge progress in our field; they needed to build custom clocks to make the thing work).

Jason C. McDonald • Edited

Those are still only abstractions around the same logic, and many of the arguments for or against any of those practices or technologies are rooted in an understanding of said logic.

No matter what costume you put the duck in, it's still a duck. We should not pretend it is now a moose.

stereobooster • Edited

> Those are still only abstractions around the same logic

This is an oversimplification. If we follow this logic, we could all write programs in assembly (or Brainfuck, which is Turing complete). Why even bother with higher-level languages and abstractions?

Jason C. McDonald • Edited

> If we follow this logic, we could all write programs in assembly

...and we literally could, though it wouldn't be a good use of our time.

> Why even bother with higher-level languages and abstractions?

Because they allow us to save repeated implementation effort, not because they prevent us from thinking about the underlying concepts. For example (a short code sketch follows this list)...

  • When I use the sorted() function in Python, I know I'm using Timsort, which has a worst-case algorithmic efficiency of O(n log n). Whether that has implications for what I'm doing depends on the situation, but I'm not in the dark about it in any case. (The abstraction means I don't have to reimplement Timsort myself.)

  • If I'm storing the high and low temperature of the day to a single decimal place, and I'm working in C++, C, or Java, I will use float instead of double. I know I don't need double precision for a number which is merely meant to be scientifically approximate, so I don't want to lazily waste the extra memory or processing time. (The abstraction means I don't have to shove the bytes into memory myself.)

  • If I'm writing a function that takes a list as an argument, I need to be intentional about either my use or my avoidance of side effects, and that requires an understanding of how the list is being passed: Copy? Reference? Assignment? (The abstraction means I don't have to fiddle with registers.)

  • When I am deciding between using a loop, recursion, or a generator expression (in, say, Python), I need to understand the pros and cons of each. The wrong decision here can have significant impacts on the performance of the code. (The abstraction means I don't have to mess with assembly jump instructions and/or manual loop unrolling.)

  • If I'm storing a collection of data, I need to understand how it needs to be accessed. Do I need random access? Am I only adding and accessing values from the front or the back? Does it matter what order it's stored in? Those are just some of the fundamental differences between how a list, a stack, a queue, and an array are stored and accessed in memory. The wrong data structure will at best waste resources, and at worst introduce significant bugs. (The abstraction means I don't have to reimplement these data structures.)
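
To make a few of these concrete, here's a minimal Python sketch. The data, function names, and collection sizes are my own illustrative assumptions; everything it uses is in the standard library.

```python
# A rough sketch of the decisions above. The names and numbers are
# made up for illustration; only the standard library is used.
from collections import deque

# sorted() is Timsort under the hood: O(n log n) worst case. Fine for a
# one-off sort, but if I only need the single smallest value, min() does
# it in O(n) without sorting anything.
readings = [21.4, 19.8, 25.1, 22.0]
ordered = sorted(readings)           # full sort: O(n log n)
coldest = min(readings)              # no sort needed: O(n)

# A list argument refers to the caller's object, so mutating it is a
# side effect the caller will see. Be intentional about which one you mean.
def normalize_in_place(values):
    peak = max(values)
    for i, v in enumerate(values):
        values[i] = v / peak         # modifies the caller's list

def normalized_copy(values):
    peak = max(values)
    return [v / peak for v in values]  # caller's list left untouched

temps = [21.4, 19.8, 25.1]
normalize_in_place(temps)                     # temps has now changed for the caller
fresh = normalized_copy([21.4, 19.8, 25.1])   # original argument untouched

# Generator expression vs. list comprehension: the generator yields values
# lazily, one at a time; the list materializes everything up front.
squares_lazy = (x * x for x in range(1_000_000))  # O(1) memory
squares_all = [x * x for x in range(1_000)]       # O(n) memory

# Data structure vs. access pattern: popping from the front of a list is
# O(n) because every remaining element shifts; a deque does it in O(1).
jobs_list = list(range(10_000))
jobs_queue = deque(jobs_list)
first_slow = jobs_list.pop(0)        # O(n): shifts the rest of the list
first_fast = jobs_queue.popleft()    # O(1): built for front access
```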

In all these cases, I'm using abstractions, but I understand what the implications of those abstractions are. One can afford to "wave off" a few of these things now and then, but if one habitually ignores them altogether, their code is invariably more prone to bugs, errors, inefficiencies, and maintainability problems. Wanton carelessness in these areas is why we have (for example) web pages that take up more memory than entire operating systems of yesteryear.

stereobooster

It's very hard to talk to you; I'm not sure whether you're being serious or trolling.

Following this logic, we also need to understand electronics; otherwise we're using abstractions of hardware without realising the implications (rowhammer attacks).

Turing tarpit it is.

Jason C. McDonald • Edited

I'm quite serious, and in fact, there is a degree to which programmers do need to understand some important principles of computer engineering. (Which principles depends on which abstractions you're using, and thus, unpacking.)

As you follow each abstraction down when you encounter it, you learn more and more about the underlying principles of computer programming, and yes, sometimes even hardware. That knowledge has a profoundly positive influence on your programming skills.

I think a lot of people feel defensive about this because it seems intimidating. They think "I'm not a real programmer because I have no idea what a register is!" To that, I'd say no, you ARE a real programmer. Every day will present a new opportunity to learn. You don't have to try to learn everything, nor do you need to learn it all right now.

The important point, the universal distinguishing characteristic of a good programmer, is simply the willingness to keep learning. When you realize you must make a decision about an abstraction, when you encounter a new tool, when you have a more senior developer point out a pitfall you overlooked, you take the plunge down the rabbit hole and fill in the gaps.

Moment by moment, day by day, fragment by fragment, you uncover the deeper truths underneath the "magic", and you become a better programmer for it.


Example: if you followed just one of those rabbit holes — data structures — all the way down to the silicon, you're really only going to encounter basic memory addressing and CPU caching (and maybe a bit of binary jazz). Neither is as scary or complex as it sounds, and both are incredibly enlightening. Once understood, they become as elementary as multiplication (I've sketched a small slice of this below).

Yet in that same scenario, understanding (say) how the power supply unit works and how it provides voltage to the RAM is utterly irrelevant; it has no meaningful effect on the fundamental concept.
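
To make that rabbit hole a bit more concrete, here's a rough sketch (CPython-specific; the exact byte counts vary by platform and version, and the values themselves are invented) of how the same numbers can end up laid out very differently in memory:

```python
# Rough sketch: how the "same" collection of numbers is laid out in memory.
# Sizes are platform-dependent; the point is the shape, not the exact bytes.
import sys
from array import array

values = [20.5, 21.0, 19.8, 22.3] * 1000

as_list = list(values)             # an array of pointers to boxed float objects
as_doubles = array('d', values)    # contiguous 8-byte doubles
as_singles = array('f', values)    # contiguous 4-byte floats (less precision)

# getsizeof() reports only the container itself: the list's floats live
# elsewhere as separate objects, while the arrays hold raw values inline.
print(sys.getsizeof(as_list))      # the pointer array
print(sys.getsizeof(as_doubles))   # roughly 8 bytes per element
print(sys.getsizeof(as_singles))   # roughly 4 bytes per element
print(sys.getsizeof(values[0]))    # each boxed float the list points to
```

The contiguous layouts are also the ones the CPU cache rewards: neighbouring values arrive together in a cache line, whereas a list of boxed floats means chasing pointers around the heap.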