Blockchain...
The first time I heard of it was in relation to a cryptocurrency. The second time, another cryptocurrency. And then, several times later, I heard of it in the context of a "decentralized internet" project called LBRY, of which I am an early user.
I've lost count of the number of times I've heard of it since.
Now, I recognize that this word represents some hugely vital concept in programming, which I won't even begin to pretend I understand. I'm not commenting on the viability or usefulness of blockchain for any given project. But eight years studying communication in this industry has taught me how to spot fads...and this one just hatched.
"Fad" is a pretty loaded word, so let me slam on the brakes and explain myself. There are several brilliant innovations I'd classify as technology fads, but never on the basis of their technical details. An innovation only becomes a technology fad when we start mass-adopting it on any basis other than its own merit!
I can point to three other distinct technology fads in fashion today.
For example, look at Javascript, a language whose design is widely criticized in many circles, not entirely without merit. As much as I personally dislike Javascript, I'll grant that it presently has a solid use case in web development, specifically in manipulating HTML and CSS to create smooth, interactive web experiences.
Yet it seems that a major sector of the industry is trying to replace the entire technology stack with Javascript. Applications! Data processing! GUIs! There's nothing it can't do!...or so this crazed bunch tells us. But if we're honest, Javascript is not well suited to these use cases; no matter how many magical frameworks we pile on it, Javascript was designed for web interactivity. We're now sitting on a volatile pile of fragile spaghetti code, just waiting for someone in IT to sneeze before it all falls apart.
This isn't Javascript's fault. We're just trying to use it to replace C++, Python, Rust, Ruby, Perl, R, Haskell, Go, and the rest of the gang. We've lost sight of what Javascript is supposed to be. By the same token, we can't really use any of those other languages to truly replace Javascript, because they weren't built for the same purpose!
Languages need to know why they exist, and stick with those reasons. Writing a statistical computing program in C++ is (usually) as silly as building a game engine in R. You could probably manage to pull off both, but you'd be wasting a lot of energy and time for a suboptimal result.
In other words, Javascript's problem can be summarized in the words of poet John Lydgate: "You can please some of the people all of the time, or all of the people some of the time, but you can't please all of the people all of the time."
If we move away from language debates, we won't go far before we come across some mention of the cloud. It wasn't long after this mystical land of water vapor and data was first mentioned that some astute IT worker pointed out...
The cloud is just someone else's computer.
The cloud offers some very neat innovations. My company's website migrated from traditional hosting to a Linode VPS, and we couldn't be happier. Some software and services are empowered by an intelligent use of cloud computing and VPS technology (two different concepts that we keep lumping under the same cover-all name). However, the cloud doesn't always have a silver lining.
One online friend described how his company's CTO decided to migrate their entire infrastructure to "The Cloud," with little more justification than some empty tech babble he no doubt picked up by skimming an article on Slashdot. His plan held less water than the actual clouds over the tech office, but he plowed obliviously forward, deaf to the warning cries from half his IT staff. Because he had heard the siren song of The Cloud, he insisted on migrating an entire, humming, live infrastructure, databases and all, from their in-house servers to AWS. The functional infrastructure was to be dismantled and rebuilt in AWS, with no gain in functionality, a significantly higher operating cost, and years of fixing the problems that come from hammering a square peg into a round hole. And all the IT department wept.
I've also watched programmers "solve" problems in relatively minor app projects by using "The Cloud," when local solutions were faster, cheaper, and more efficient. In one case (I forget the technical specifics), a college senior project stood up no fewer than three cloud-based microservices for a smartphone app that, in retrospect, didn't really need any.
I can still hear the marketing exec who dreamt up the term "the cloud" cackling on his way to the bank.
Finally, we have neural networks. Again, I'm not going to claim expertise here, although I'll admit that I get a little excited seeing some of the stuff they manage with these! AI can pull off some amazing feats with neural networks and machine learning.
However, I'd also classify neural networks as a fad. I can be found lurking on Freenode IRC nearly ten hours a day, six days a week, and I can't tell you how many times I've heard people mention using neural networks for purposes to which they are entirely unsuited. It was as if someone mentioned it on Reddit, and now every basement programmer wants to implement their own.
Of the three, this one seems to be losing steam very quickly. Perhaps that's because you can't get very far into implementing machine learning before you realize you're in over your head, and that you could achieve just as good a result in your Yet Another Number Guessing Game with a simpler algorithm.
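To make that concrete: the number-guessing game above needs nothing more exotic than binary search over the valid range. A minimal sketch (the function name and the 1–100 range are my own, purely illustrative):

```python
def guess_number(secret, low=1, high=100):
    """Guess `secret` using plain binary search over [low, high].

    Returns the number of guesses taken. For a range of 100 numbers,
    the worst case is 7 guesses -- no training data, no neural network.
    """
    guesses = 0
    while low <= high:
        mid = (low + high) // 2  # guess the midpoint of what's left
        guesses += 1
        if mid == secret:
            return guesses
        elif mid < secret:
            low = mid + 1   # secret is higher; discard the lower half
        else:
            high = mid - 1  # secret is lower; discard the upper half
    raise ValueError("secret was outside the given range")

# Worst case over every possible secret in 1..100:
print(max(guess_number(n) for n in range(1, 101)))  # prints 7
```

A dozen lines of conditionals and arithmetic, guaranteed optimal for this game — which is exactly the kind of result the fad-driven neural-network version struggles to match.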
I think blockchain may be poised to replace neural networks in terms of technology fad rankings, and it makes me sad. There are a lot of good ideas that we need to be pursuing with this technology, but after the hordes of fair-weather fans finish with it, the tech world will be left with ringing eardrums and a bad taste in its mouth. When that day comes, it will take a brave soul to even suggest blockchain, until the technology fades into a sunset of obscurity.
You may be shaking your head at me. "These aren't fads," you might say, "and even if they are, the damage won't be that bad."
Allow me to remind you of a few fads of yesterday.
Java. Hadoop. Wordpress. Python. Joomla. Shockwave Flash. Extreme programming. Singletons. HTML iframes. Macros. Goto statements.
All of these have uses. There are, or were, proper applications for each. But the frenzied application of these technologies to every problem imaginable wore off the shine and finish, leaving them with a haze of knee-jerk repulsion. Python had to gain a whole new version to regain some of its lost relevance.
Now we have a whole new batch of ideas and technologies, some of which are already suffering the damage of fad status.
Cloud Computing. Neural Networks. IoT. Go. SaaS. Rust. Docker. Haskell. Javascript.
And blockchain.
Our innovations deserve better treatment than this. Please, for the love of all the shiny new technologies, as well as the old reliable ones, follow these three simple rules:
Know why the technology exists, and generally use it only for that purpose. We can certainly explore innovative uses of technologies, but be careful not to force a square peg into a round hole, nor to make one technology do everything.
Choose technology solely on its merit in the context of your project, and never on the basis of its trendiness, popularity, or lack thereof. FORTRAN may well bear consideration in the same breath as Clojure.
Combat fad-based technology decisions. Have a broad understanding of technologies old and new, and be generous with this knowledge. When you spot someone building a desktop solitaire game in Node.js using a neural network, remind them that Python and basic conditional statements are things.
Working together, we might just prevent the day that some technical whitepaper bears the ominous title "Blockchain Considered Harmful".
What are some fad horror stories you've encountered?
Top comments (16)
Great points. I've never gotten too caught up in fads but I have suffered from anxiety-inducing FOMO before I got over that line of thinking.
Great thoughts! It's almost as if you have to think about the problem you're trying to solve yourself and ignore which languages/tools have the most headlines at any given moment. ;-)
Thanks for this; as a fairly new developer I'm less aware of the fads of the past, and get caught up with considering learning the current fads. I generally don't because of laziness, but it's so easy for me to get overwhelmed by thoughts of what I "should" learn.
Check out this year's Gartner Hype Cycle, which puts AI at its peak and projects that Blockchain is in for disillusionment.
cityam.com/270451/gartner-hype-cyc...
I agree that Neural Networks and Blockchain have been hyped up to a ridiculous extent, but I think there's good reason for it: in contrast to Javascript, Wordpress, and the other languages/frameworks you pointed to, these new technologies are infrastructural. Once they mature, they will change practically every aspect of our lives, as people and as programmers.
So I wouldn't call these technologies fads, because although the hype seems unreasonable, and most of these new AI/blockchain startups will fail, in the long run these technologies are here to stay.
Well, sure, but remember - a fad isn't a fad because of itself, but because of crowd response. Python is a solid technology, and is most definitely here to stay, but it made my list of bygone fads.
Honestly, I could have also included Object Oriented Programming, Test Driven Development, and Agile Methodology as well - they all also classify as fads in how they were hype-adopted. Blockchain and AI will no doubt continue to be important, but that doesn't mean they're not fads.
Okay, I was going by the definition of a fad as being "short-lived" rather than just "over-hyped." Regardless, thanks for the article; it was a good read. :)
That's actually the funny thing I discovered about the word "fad" in terms of definition - it doesn't necessarily imply the thing in question is short-lived.
Outside of programming, look at the Beanie Babies fad. They may have considerably faded in terms of hype, but TY still makes a TON of money from selling them.
Lol beanie babies were my jam one school year.
Honestly though, I don't think AI and blockchain will fade, or even slow down in the long term. AI will eventually be building software and taking our jobs, while steering our cars, babysitting our kids and even giving us life advice as personal assistants. :/
And Blockchain is the backbone of IoT, not to mention finance and eventually governance. It's only a matter of time until the tech catches up to the promise. But until then we'll have to deal with misinformed CTOs making bad decisions, and ICOs making bank off people's false hopes.
Beanie babies are still my jam :)
You said what I always think when I see a tweet about how AI will suddenly gain consciousness and take over; how we should run JS directly on the CPU; how SaaS is the must-have thing for all your apps; how DevOps is the superhero that fits everywhere; and how we should use Unity3D to make multi-platform apps.
Even if I exaggerated some of these thoughts, all of them are based on true stories.
Often, we follow the rule of the golden hammer: "When all you have is a hammer, everything looks like a nail."
The problem is when your "non tech savvy but thinks he is" employer goes fervently after every fad he sees, completely disregarding the measured and researched feedback of his developers. I am going through hell with this stupid "JavaScript for everything!" fad right now because of it, and I just can't wait until I get away from it. He's losing a much needed employee over his stupidity, and I'm pretty sure he's not alone in that.
I just had to Google what Blockchain is and I'm none the wiser. Did I pass?!
If you really want to know more about Blockchain, send me some bitcoin, I'll give it a look and tell you.
Kidding, here's a list of good articles here: dev.to/search?q=Blockchain
Who are the cool kids?
I was listening until you wrote Java and Python. A little googling shows they are in the top 3 of all surveys and have been so for some time, and went up over the last year.
Well, I'm not actually basing this on the surveys, which are better indicators of industrial prominence than they are of fads (thank goodness). A lot of what I wrote about in this article (which is over a year old, mind you) was based on conversations I was encountering in the programming world. I was actually hearing a lot of "Python? Bah - use Haskell" at the time. That has since switched. I think Haskell fell out of fad status for the moment.
You are correct that Python and Java are very common languages, but when I wrote this, they weren't necessarily the trendy languages. Python has had a bit of a recent resurgence in fad-popularity in the past year, especially in the data sciences. Once again, everyone wants to build everything in Python.
Meanwhile, Java continues to gain a reputation as slow and clunky. Even today, among the trendites, you're going to hear "don't use Java, use Go!" and such fad-based nonsense, and that's more my point.
I can't stress enough, fads aren't necessarily directly related to a language's industrial prominence...C++ has also been one of the top 5 languages for years, but it's NOT trendy by most assessments. That's exactly the point of the article: fads aren't actually rooted in anything more than the subjective obsession with the newest, shiniest toy.