In conversations with people who are much smarter about education than I am, one idea that came up was the notion of foundational knowledge. To define "foundational knowledge" by analogy, think of learning basic arithmetic: first you learn addition, then subtraction, then multiplication, and finally you move on to division. I can't claim to understand the underlying pedagogical theories, but it seems to me that if a child hasn't mastered addition, there's little point in him or her moving on to subtraction, etc. And you sure aren't ready for algebra if you can't deal with basic arithmetic.
In the same way, I've often thought there should be some sort of basic curriculum for software developers: some foundational knowledge that must be mastered before one can progress to harder subjects. I've seen a lot of places teaching "Principles of Object-Oriented Programming" while never concerning themselves with whether a developer can competently write a loop. To me, being able to write a loop is the developer equivalent of mastering addition. (Feel free to insert charges of me "gatekeeping" here.)
Now I know there are some folks reading this who will say that many developers will never need to concern themselves with writing loops. And that's probably a fair criticism. However, some knowledge is foundational to other knowledge. Could a student of mathematics understand the notion of a summation if he or she didn't understand the basic notion of addition?
Even beyond the question of certain pieces of knowledge being "foundational," there's the question of thinking in a computational way. While human beings are capable of leaps of intuition (i.e., arriving at an answer without processing all the intervening steps), computers are not. In order to program a computer, we need to tell it all the steps needed to reach a solution. In fact, it's the very human trait of glossing over non-essential details that makes writing software hard.
Let's make this more concrete. If I said to you, "I need a program to read a file of totals and find the largest dollar amount in the file," and you don't think of using a loop (or some language library that contains looping logic underneath), you're doing it wrong.
The pseudocode for such a process would look something like this:
Set stored dollar amount to zero
:start
Read line from file
Parse dollar amount from line
If dollar amount is greater than stored dollar amount
    set stored dollar amount to new dollar amount
If next line is available go to :start
Now I would need to translate that into language-specific calls, but that's essentially what I would need to do. Do I care about a File class (in the classic OOP sense)? Depending on the language, I may. Do I care how I store the dollar amount (i.e., as an integer or a float)? Again, depending on the language and the problem, I may. But the most important thing is to think in terms of a loop to solve the problem.
Of course, my pseudocode doesn't take into account any potential error conditions; it's foundational, and other code would need to be added to handle them.
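To show how little distance there is between the pseudocode and real code, here's a sketch of that loop in Python. The function name and the file format (one dollar amount per line, optionally prefixed with "$") are my assumptions for illustration, and I've added the most basic error handling the pseudocode glosses over:

```python
def largest_amount(filename):
    """Return the largest dollar amount found in the file."""
    largest = 0.0  # set stored dollar amount to zero
    with open(filename) as f:
        for line in f:  # loop: read a line while one is available
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                # parse dollar amount from line (tolerate a leading "$")
                amount = float(line.lstrip("$"))
            except ValueError:
                continue  # ignore lines that aren't dollar amounts
            if amount > largest:
                largest = amount  # set stored amount to the new amount
    return largest
```

Whether the `for line in f` idiom counts as "writing a loop" or leaning on the library is beside the point; either way, you had to think of the problem as a loop before you could write it.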
My point is that this basic sort of computational thinking is quite separate from language paradigms (imperative vs. functional, OOP vs. procedural, etc.), and in some ways it's fundamentally harder to master. If a developer cannot master something this basic, though, he or she is bound to fail eventually, because this sort of thing is going to come up. And even if a language's library can hide the details from you, you still need a notion of what's going on when you have to debug it.
If you want to be a better developer, first ensure that you've mastered these foundational concepts of software development. This is not to diminish the difficulty of learning more advanced concepts; it's simply to say that, to me anyway, developers too often leap to advanced ideas without mastering basic ones, and their work suffers for it.