How do you distinguish over-hyped technologies (which are going to die) from innovative technologies (which are here to stay)?
Yes, I know:
it is better to choose ...
This is something I've written about before.
All The Cool Kids Are Doing It
Jason C. McDonald ・ Nov 22 '17
Also, technologies virtually never "die", and we need to stop using that term altogether...
Retraction of an Obituary
Jason C. McDonald ・ Dec 28 '18
Re: All The Cool Kids Are Doing It
Yes, very related. How to distinguish hype (cool kids doing it) from innovation. Blockchain - hype. JavaScript - here to stay. Neural networks - probably here to stay, but overused and functioning as a buzzword for VCs.
Re: Retraction of an Obituary
I agree that "dead" means different things to different people. I would say dead means there is no sense in a newcomer learning it. If you already know the technology, there will always be some demand for it. For example, if you know COBOL, there are probably some old companies that still need those systems supported. But if you are just starting your career, it makes no sense to learn COBOL. Learning Python, JavaScript, etc. would make more sense: it would be easier to learn because it would be easier to find a mentor, a community, and a job.
I should say that I don't consider the following technologies dead: C, C++, Java, Ruby on Rails. But I would also consider alternatives, like Rust, Go, and Phoenix (Elixir).
On the contrary, some of the most stable and highest paying jobs go to developers who can maintain legacy systems. There are many among us who actually enjoy this! (And it's easier for a COBOL developer to find good employment than a JS developer, in some cases!)
So your statement is both patently false and dangerously misleading, albeit unintentionally. It's helpful to encourage them to learn modern languages and practices along the way, but one should never ever ever EVER dissuade someone from a technology or subspecialty they find interesting. (Besides, the old often informs the new more than we care to admit.)
If someone wants to start with COBOL, there's actually no harm in it. When starting out, it's most important to learn the fundamentals of programming. Each language presents a learning curve anyhow.
But is there a high number of such positions? Probably not. That means your choice is limited, which makes it unsuitable for some people. For example:
Supporting a legacy system is a double-edged sword. Yes, it can be high-paying. But one day they fire you, and it can be very hard to find a new position in that technology.
Those high-paying positions supporting legacy systems are probably looking for seniors. So if a newcomer starts by learning an outdated technology, it may be hard for them to find a job.
You really would be surprised. The job market is far more varied and intricate than you would expect. Job postings represent an extraordinarily small segment of positions.
Once again, not necessarily.
Becoming well versed in a variety of technologies is always important. Only knowing COBOL is just as unwise as only knowing JavaScript.
Don't lose the more important point in the midst of it: technologies don't die. Any technology you are interested in, learn it. If it does what you need, use it. And for the love of all things digital, don't try to steer people away from the technologies they like into yours.
I may have a skewed picture, but I monitor the job market regularly.
For a hobby, for self-development, for fun - yes, sure. For higher chances of getting hired - maybe not.
Agree
It's not possible to monitor things that realistically. All you can do is spot trends in posted jobs, nothing more. You're taking measurements of the tip of the iceberg, which, while interesting and helpful, doesn't reflect the majority of the reality underneath.
If you're learning anything purely to get a job, lacking any appreciation of it, you're doing it wrong. The last thing you want is a career working in a language you resent.
And, once again, you don't know how COBOL experience may impact someone's chances with a particular job. You're trying to replicate a very intricate painting with exclusively broad strokes, here. ;)
Nope. There is nothing wrong with learning programming simply to get a job, without any appreciation of it. People simply want a stable job to feed their family.
I mean, there are people who can afford to learn programming because they appreciate it. But there are also a lot of people who are in it simply for the money.
I am speaking about learning a particular language or tool, not learning programming as a whole. Two different points; you have addressed only the point I did not make.
There are hundreds of languages, and thousands of technology stacks, from which to choose. One should not feel obligated to pick up and master Language They Hate because someone (you) told them they couldn't find work in Languages They Love, when in fact there would have been jobs had they built the skills.
On COBOL, I suggest It's COBOL all the way down - it's definitely not dead, and some companies are trying to train people either to maintain or to modernize the code.
Note that simply learning COBOL won't get one very far. Mainframes use a completely different development workflow, for example. Their OSes are also unlike anything found in daily life now, starting from the unusual terminology.
People who make lots of money maintaining legacy systems are so valuable precisely because they know how to maintain the whole system.
Not to dissuade anyone from learning that, but it's much more than just learning a new language.
Are you sure these are the right dimensions - over-hyped/going-to-die vs. innovative/going-to-stay? I don't think so.
Flash was innovative! Adobe provided a technology to build dynamic web content long before HTML5. They just had the right product at the right time. But since everybody (speaking of internet/technology companies) wanted to build dynamic web content without relying on a proprietary product, it was doomed. It was just a matter of time before something better was developed. And then it took some more years to migrate all the Flash sites to HTML5.
The main benefit of CoffeeScript was that it was integrated into Rails back when Rails was the most popular web framework. Was it innovative...? Well, I wouldn't say that... it was more a natural fit for Ruby developers than plain JS. Was it a trend? Yes, I'd say so.
Do you know Gartner's hype cycle? Their standpoint is: every successful technology is hyped in its early days. Expectations grow to an unrealistic level that no technology can satisfy. The next phase is the "trough of disillusionment". Failing technologies never recover from it, but the successful ones come back and prove their usefulness in successful products.
Thanks for your reply. Good points.
Totally true. Also, as soon as something "better" appeared, it died off very fast. It was innovative, but not unique enough to stay.
So true. It was trendy but didn't have any real advantage except its Ruby-likeness. As soon as people looked beyond RoR, it kind of lost its value in people's eyes.
Nope. Thanks for the pointer.
It depends on how many developers bring a technology into their projects. JS is here to stay. DevOps is trending now, but the core Linux is not going anywhere, and neither are Java and C.
A programmer must be a problem solver rather than a specific-tech-stack person; specific stacks will definitely fade out in the future.
Interesting article that would be great to expand on further, especially around the psychology of it all.

The ideas in Polymer became native technology in all the evergreen browsers nearly a year ago, starting significantly in Chrome in 2016, then Safari, Firefox, and recently Edge. That native platform now largely supports what Salesforce uses, as well as ServiceNow, which drives numerous widely used ticketing applications (Apple, etc.; you've probably used their product without realizing it). I think ING bank and others are basing their UI platforms on native Web Components as well. To say Web Components aren't mainstream really means they aren't popular in commodity front-end development.

Most bootcamp grads just know React, and for some reason bootcamps keep teaching it, further filling the labor pool with an oversupply of that knowledge and an undersupply of what will move into the future. You can actually build React-like UIs without any library at all, keeping features you'd use React for (state management, event delegation, etc.) while removing the build process; see the sketch below. So while the native tech roars ahead, most of the market digs deeper into the fairly specific and moderate overhead of React. It's a bit odd, but I suppose we go with what we learn initially and don't typically look past it. It seems most Silicon Valley startups are entirely React, simply because founders believe that's the only way to build something. In Germany there's quite an adoption lag of 5-10 years. In Spain and the Netherlands, Web Components have taken off and it's common to find LitElement/lit-html.

I've wondered for a long time when SASS will die off, since modern CSS and JS essentially make it irrelevant. For some reason people prefer to keep these older technologies around; presumably it provides some comfort, regardless of utility. React specifically is based on notions from when it was made around 2013, relating to browser and performance issues that generally don't exist anymore. Unfortunately, modern work often looks more like cargo culting than anything else. I had a manager say in passing that web development is about 30 years behind most of software engineering, which largely speaks for itself.
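To make "React-like UIs without any library" concrete, here's a minimal sketch of a native custom element using only standard Web Components APIs (the `user-badge` tag name is just an invented example):

```javascript
// A native Web Component: no framework, no build step required.
class UserBadge extends HTMLElement {
  // Re-render whenever the observed attribute changes.
  static get observedAttributes() {
    return ['name'];
  }

  constructor() {
    super();
    // Shadow DOM encapsulates the component's markup and styles.
    this.attachShadow({ mode: 'open' });
  }

  connectedCallback() {
    this.render();
  }

  attributeChangedCallback() {
    this.render();
  }

  render() {
    const name = this.getAttribute('name') || 'anonymous';
    this.shadowRoot.innerHTML = `
      <style>span { font-weight: bold; }</style>
      Hello, <span>${name}</span>
    `;
  }
}

customElements.define('user-badge', UserBadge);
// Usage in plain HTML: <user-badge name="Ada"></user-badge>
```

This runs as-is in every evergreen browser, with no compiler or virtual DOM in between.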
All of the examples you mentioned had their moment (except Web Components - but that doesn't mean they are dead; actually, they are the "future").
This is a hard question to answer, mainly because front-end technologies change too fast. JavaScript is continuously evolving now; this brings new ideas to people and lets them create new stuff that could become trendy.
It's common practice to wait a little when a new trendy library appears before implementing it or using it in a big project. It's better to wait for the community's feedback and see if it's actually a good idea to use it.
Web Components didn't reach stability until 2016, when v1 was released in Chrome and then steadily rolled out in Safari, then Firefox, and now Edge. In 2019 (last year) LitElement (and previously lit-html) came on the scene; it outperforms React with a nearly identical feature set. Whether this gets broad notice, especially in bootcamps and the trendier startup world, is entirely unclear, but it's quite established in the corporate world: ServiceNow's ticketing system (e.g. Apple's ticketing system, among others) implements (a weakly attempted functional wrapper around) Web Components, as do ING bank and Salesforce. Given the recent shift in browsers, this is likely to gain inroads wherever easier development and lower overhead are preferred. It is literally easier to debug, create, and deploy, as well as perceptibly faster and smaller than React in all the cases I've seen so far. It will be interesting (for me, anyway) to see where this goes.
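For flavor, a minimal LitElement component might look like this (a sketch against the lit-element 2.x package mentioned above; the `hello-card` tag is just an invented example):

```javascript
import { LitElement, html, css } from 'lit-element';

// A small LitElement component: declarative lit-html templates,
// reactive properties, and Shadow DOM styling out of the box.
class HelloCard extends LitElement {
  static get properties() {
    // `name` is a reactive property; changes trigger a re-render.
    return { name: { type: String } };
  }

  static get styles() {
    // Scoped styles, encapsulated by the Shadow DOM.
    return css`p { font-weight: bold; }`;
  }

  constructor() {
    super();
    this.name = 'world';
  }

  render() {
    // Only the changed parts of the template are updated on re-render.
    return html`<p>Hello, ${this.name}!</p>`;
  }
}

customElements.define('hello-card', HelloCard);
// Usage in plain HTML: <hello-card name="Ada"></hello-card>
```

The result is a standard custom element, so it interoperates with any page or framework that can render HTML.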
R.I.P. Flash games.
Just found this talk: Why Isn't Functional Programming the Norm? – Richard Feldman. The author shows how and why some languages got popular.