Tell me if you have heard this premise before:
The poster, Abisekh Subedi, is dealing specifically with the paralyzing uncertainty that comes with breaking into software development or taking a career leap. There are so many options, and plenty of devs to tell you why each is the best. How can a newer dev possibly be sure of their choice when every option is met with a million voices prescribing an alternative, pointing you down another path, only for the same voices to greet you when you venture in the other direction?
I thought this comment was the perfect response:
The solution is simple.
Ignore literally everything people have ever told you to learn, pick one programming language that looks interesting, and Google for its official tutorial.
Can't decide? Then just write all of the languages you're considering on slips of paper, put 'em in a hat, and draw one.
Above all, just learn a language. Popularity be darned.
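If you take the hat suggestion literally, it's a two-liner in Ruby. This is just an illustrative sketch, and the language list is a placeholder for whatever you're actually considering:

```ruby
# The "hat": slips of paper for each language you're considering.
candidates = ["Ruby", "Python", "JavaScript", "Go", "Rust"]

# Draw one slip at random. That's your pick; now go find its tutorial.
pick = candidates.sample
puts "Learn #{pick}. Popularity be darned."
```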
I'm not sure the official tutorial is always the right place per se. Your mileage may vary on that part, but the advice is sound: there are many correct answers in software development. The diversity of languages and domains means there is opportunity for growth in many directions, and your choice isn't that important.
In addition to the ample career potential in mastering pretty much any technology in our field, the important part about learning a language, popularity be darned, is that the act of learning something all the way through is the actual skill you'll be using throughout your career. There are many stages of learning and making use of the tools in software development: reading the docs, debugging your issues, combing through codebases, contacting members of the community, attending workshops and conferences, blogging, teaching others, and everything in between. Gaining experience in all of these areas is what you are really learning. The technology is simply a detail at this point. You are learning to learn.
All this being said, you need not go in completely blindfolded. Do some basic advice-seeking, figure out what fits your approach, and feel free to keep experimenting as you go. But in Abisekh's case, as in many others, they have some idea of what they might be into, and at that point they can jump in and get going.
But aren't there some dying technologies I should avoid?
There are a few languages and domains truly fading away, but the rumors of most software's demise are greatly exaggerated. A lot of the time, "dying" just means "mature" or "boring". Boring is often a good thing. I write a lot of Ruby code, and I've had my concerns about its future place in the world, but I've come to realize that its benefits are as true today as they always were, and the future is bright—even if it's fading away from the front page of Hacker News. It seems like Ruby went from "hectic and unstable" to "old and dying" with about five minutes of prominence in between. But this is all a mirage, I assure you.
Here are some great comments from a thread I started on the topic a little while back:
I think the one thing that will always keep Ruby around is how easy it is to start off with when you're first learning to code. The readability of it, its elegance, the Ruby community -- these things lower the barrier of entry when it comes to being able to pick up a language, its associated framework, and the different contexts in which it ends up being the right tool for the job. It makes it really easy to hit the ground running and start being productive, particularly if you're still fairly new to the field.
Ruby's decline was predicted in 2006 in a blog post by David Megginson (a Python programmer) about "the programming language cycle". In that blog post, David outlined a theory that programming languages are invented by elite programmers trying to differentiate themselves from the 'riff-raff'. David concluded his blog post by writing:
The final and most important point here is that a programming language’s perceived coolness will always suffer from its success. Java cannot possibly still be cool when there are thousands of regular developers slaving away in the bowels of ACME Widgets using it to write enterprise applications. If, in fact, Ruby displaces Java in the enterprise (which may not happen, since Ruby has no advantage over Java to match Java’s memory-management advantage over C++), it will suffer precisely the same fate, and we can expect Bruce Tate to write a book Beyond Ruby in five years or so.
By that measure, Python’s very failure is a kind of success — as long as it never really takes hold in the workplace it will always carry a small degree of distinction with it, and at least a few elite developers won’t feel pressured to move on. Like a movie or band that never becomes too popular, Python will hang onto its snob appeal.
The implication of David's blog post, though, is that new programming languages aren't solely (or even primarily) motivated by technical concerns, but by elite programmers' desire to retain their social standing. So even though Ruby may have "declined", it is still a great language...just like PHP...just like Java...just like C++...etc., etc.
It's not going to be easy, and the word "just" should perhaps be avoided when describing a lengthy and taxing journey. Basically, this is what I'm talking about: