Learning a new language takes a long time. Unless you need it for your job (or a personal project that's important to you), it's a bad investment. In this post, I'm going to show you why "learn at least one new language every year" is bad advice and what you should do instead.
As best as I can tell, this idea was popularized by Andrew Hunt and David Thomas in their book The Pragmatic Programmer (which is a great book, by the way).
In a section on investing in your knowledge, they wrote:
Learn at least one new language every year. Different languages solve the same problems in different ways. By learning several different approaches, you can help broaden your thinking and avoid getting stuck in a rut.
The quoted text was the first recommendation in a list of things programmers can do to improve their knowledge. And it's a reasonable recommendation in the context of the book.
However, over the years, people have stripped that context away and turned it into something like this: "Learn at least one new language every year or you're not a good programmer."
And it's this twisted version that I want to talk about in this post.
Programmers, especially beginners and junior devs, have taken this advice to heart. They are running around on the internet trying to figure out which language they should learn next and stressing about the fact that they don't know enough languages. But it's all a distraction, and it's hurting both people and their results. Let me tell you why.
Not only should you be reading blogs, writing your own blog, contributing to open source projects, networking with other programmers, working on exciting personal side projects, going to conferences, reading to keep up with the latest tech developments, and doing your actual job, you should also "learn at least one new language every year". Who needs sleep or a life, right?
New devs are almost always overwhelmed and they need help with the basics.
Do they understand how the projects they are working on deliver value to the business? Do they understand the code base? The tech stack? Can they make a useful code change that will pass code review? All these other professional development expectations are not helping new devs master the basics.
Hunt and Thomas wrote that learning a new language helps broaden your thinking so you don't get stuck in a rut. But I can think of easier ways to broaden your thinking. You could:
- learn design patterns
- study architectural patterns
- look at open source projects
- talk to other programmers
- watch video lectures online
I'm just a little fuzzy on the mechanics here. How are people "learning" these languages? Reading a book? Learning the syntax? Writing a "hello world" program? Duplicating a tutorial in the language? Writing a 1,000 line project?
In my experience, learning the syntax of a language is the easy part. If you want to write production quality code in that language you also have to learn the tools built around that language. That might mean you have to learn a new IDE. And you'll probably need to learn the frameworks and libraries for that language. And compilers, build systems, static analyzers, linters, where to get help, and how to assemble small pieces into bigger programs. Tools for testing, and on, and on.
That's a ton of work. Are people really doing all this stuff? I suspect lots of people stop somewhere between "hello world" and duplicating the code in a tutorial. And if that's the case, how useful is learning a new language?
But even if you do create a little project in your new language, I still question whether this is the best use of your time.
Suppose you invest the time required to really dig into a language and learn it deeply. How long will that knowledge remain useful if you don't use the language on a regular basis?
Four things are happening simultaneously:
- your memory for the details of the language fades every day you don't use it
- languages evolve
- the tools, frameworks, and libraries evolve
- best practices in computer science evolve
I learned Java 2 in university, but a lot has happened in the last 15 years, and I'm not sure to what extent I could even claim "I know Java" at this point.
If you learn a new language you can put it on your resume. It might help you get a job, right? Probably not.
Language/technology inflation is rampant on resumes.
I don't do much hiring but I ignore that stuff when I see it on a resume. I've encountered way too many people who claim expert knowledge of a language and then can't write a simple function in it during an interview. And if I'm ignoring that stuff you can bet that 99% of hiring managers ignore the list of languages on your resume too.
Could I be convinced that learning a new language required for a job you really want is a good idea? Maybe. If you frame it as "showing initiative," I might buy it. But even then, wouldn't your time be better spent practicing your interview skills? Or learning to negotiate a higher wage and better benefits?
The whole "learn a language in case you need it for a job one day" thing just strikes me as inefficient. How do you choose the right language to learn? How do you prevent your language-specific skills from degrading over time?
Read on for a better approach: focus on high leverage skills instead. Let's unpack what that means.
A skill is high leverage if it has a disproportionate impact on the results you care about relative to the effort you spend acquiring it. Learning how to learn efficiently is a high leverage skill.
I'd encourage you to focus on learning the things that will best help you and your employer achieve outstanding results, with a preference for knowledge with a long half-life.
- really learning your stack - get super productive
- your business - where are the high leverage places to apply your effort? What are the best problems to solve?
- how to learn efficiently
- prioritization - Theory of Constraints, Cost of Delay, and CD3
- project management
- time management
- business skills - strategy, hiring, negotiation, accounting, marketing, finance, sales, statistics, etc.
- build-measure-learn feedback loop as described in The Lean Startup by Eric Ries
- development methodologies/techniques
- design/architectural patterns
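To make one of the items above concrete: CD3 (Cost of Delay Divided by Duration) is simple arithmetic, and you can sketch it in a few lines. The feature names and numbers below are invented purely for illustration:

```python
# Minimal CD3 (Cost of Delay Divided by Duration) sketch.
# All names and figures here are made-up example data.

features = [
    # (name, cost of delay per week, duration in weeks)
    ("checkout redesign", 30_000, 6),
    ("invoice automation", 8_000, 1),
    ("mobile app port", 50_000, 20),
]

# CD3 = cost of delay / duration; higher scores get done first.
ranked = sorted(features, key=lambda f: f[1] / f[2], reverse=True)

for name, cod, weeks in ranked:
    print(f"{name}: CD3 = {cod / weeks:,.0f} per week of work")
```

Note how the big, expensive-sounding project lands last: the cheap one-week win delivers the most value per week of effort, which is exactly the kind of reasoning a language-of-the-year habit never teaches you.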
Find the highest leverage thing that will help you or your employer, learn that if you don't already know it, and then use that knowledge to improve the results of your company.
Then find the new highest leverage thing and repeat.
People who do this end up with impressive resumes full of outstanding results, which is what employers really care about.
At several points in my career, the highest leverage thing for me to learn had little to do with programming. This is especially true in small companies where you need to fulfill multiple roles. Don't be surprised if you find the same thing. Get into it; have fun. There are tons of interesting things to learn in the world besides programming.
You don't need to learn at least one new programming language every year to be a "good" programmer. In fact, chasing such a goal without good reason is silly.
If you want to learn programming languages as a hobby, go for it (I've done it myself). But don't expect it to have a huge impact on your career. This is a hypothesis you can test. Just ask your employer what kind of raise your company will give you if you learn [insert new programming language here]. If the answer is none, then you just established the market value of that language to your employer.
Key points: people hire computer programmers to help them solve problems, not for how many languages they know. The best programmers see the bigger picture and prioritize what they learn so that they can deliver the best possible results. It's always the results that matter.
Agree or disagree? I'd love to hear your thoughts in the comments section.