I have been thinking about the so-called Fullstack Developer role lately, and I wanted to get some things off my chest.
I don't like the term. It is misleading and, more importantly, dangerous for entry-level developers joining our industry. The thing is, the term means different things to different people.
On the one hand, it is used to describe backend developers who moved up the tech stack and can do frontend work with HTML, CSS, and JavaScript. But doing frontend work, or feeling comfortable with it, is very different from being good at it. I am not saying these individuals do not exist; I am sure they do, but they are exceptions who were presented with unique projects and opportunities to expand their skill set. In any case, I remain skeptical that someone can be equally good at writing multi-threaded applications and at CSS. In this context, the term Fullstack Developer by definition implies some seniority and comes with a payload of discredit and underestimation of frontend work. It diminishes the relevance, importance, and careful thought needed to build professional frontends in 2021. No matter what framework or architecture you use (SSG, SSR, SPA), you need specialists to take care of frontend work.
On the other hand, the term's meaning started to shift recently due to the surge of serverless and infrastructure as a utility. Let me put it this way:
All architecture is design, but not all design is architecture. Architecture is when we make significant design decisions, where significance is measured by the cost of change.
Where am I going with this? With serverless and the modern cloud, developers can make significant design decisions at a lower cost of change. In the past, with physical infrastructure and even IaaS, reverting a bad design decision meant a complete refurbishment of the servers, and sometimes contacting vendors to supply new hardware. Now, if a design decision turns out badly, say you chose AWS DocumentDB and it proves too costly, you can switch to AWS DynamoDB at a much lower cost of change. Likewise, if you write an AWS Lambda function for some piece of work and find out that it does not scale well, moving it to AWS Fargate or ECS with auto-scaling is relatively straightforward.
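One common way to keep that cost of change low is to hide the storage decision behind an interface, so that swapping DocumentDB for DynamoDB touches one class instead of the whole codebase. Here is a minimal hypothetical sketch (the `UserStore` interface and `InMemoryUserStore` names are my own, not from any AWS SDK); a real DynamoDB-backed implementation would live behind the same interface:

```typescript
// The rest of the application depends on this interface,
// never on a concrete database client.
interface UserStore {
  save(id: string, name: string): Promise<void>;
  find(id: string): Promise<string | undefined>;
}

// Stand-in implementation. A DocumentDB- or DynamoDB-backed class
// implementing the same interface can replace it without touching callers,
// which is exactly what "low cost of change" means here.
class InMemoryUserStore implements UserStore {
  private users = new Map<string, string>();
  async save(id: string, name: string): Promise<void> {
    this.users.set(id, name);
  }
  async find(id: string): Promise<string | undefined> {
    return this.users.get(id);
  }
}

// Application code only sees UserStore.
async function demo(store: UserStore): Promise<string | undefined> {
  await store.save("u1", "Ada");
  return store.find("u1");
}
```

The design decision being made cheap here is the choice of database: the significant part (the interface boundary) is decided once, and the expensive-to-guess part (the vendor) stays swappable.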
What does this mean? It means that it is now far more accessible for frontend developers to step into structural design work (aka architecture). Does this make them fullstack developers, as many suggest? No, it does not. Similarly to my point above, if a given developer is good at CSS and NextJS and can decide how to deploy their work with serverless components, this does not make them a Fullstack Developer. I find it hard to believe that someone can be skilled at CSS, NextJS, and NoSQL database optimization or event brokering, all at a professional-grade level. Here, the term also comes with a payload of underestimating the importance and relevance of backend work.
Again, I am not saying that these profiles do not exist in particular cases. However, I am skeptical of companies that use the term extensively in their job titles, especially when it is preceded by the label Junior. It just does not make sense.
Since the term was originally coupled to the MEAN/MERN stack, maybe what the community meant was just JavaScript Developers? Why didn't we use that term? There is nothing wrong with it. If that is the case, the term also denotes a bit of underestimation of the language, as if it were a toy not ready for prime time. In any case, the fact that we can use the same programming language across the stack does not mean that one can be good at every layer of it.
Even within the context of the same programming language, the term fullstack developer is not very accurate.
Top comments (10)
I've been in this game professionally since before the terms 'backend', 'frontend', and 'full-stack' even existed. Everyone used to do a bit of everything (server, webserver, database, front end, back end, desktop apps, system utilities, etc.)
I've been writing code for 38 years, and doing it as a job for about 26. I just consider myself a developer. It's really best to not pigeonhole yourself. Learn and practice anything you find interesting - financial or career advancement should not be your goal/motivation.
It's a testament either to the declining skills of developers or to the explosion of over-complexity that has gradually taken over and plagued our industry that these roles have come into existence. I suspect the latter.
I have had rather heated discussions with supervisors on this very subject. Full-Stack is indeed almost universally just a label for (in my experience primarily frontend) developers who dabble in "the other side" just enough to get by. Backend developers tend to refuse work on the frontend - at most, they'll hack something in that just about works and then claim to know just how "easy" frontend is, and thus why backend is worth so much more. It's a silly notion and it reinforces some heavily unbalanced attitudes. Especially when, in turn, frontend developers are expected to do their own backend work because reasons.
To me, Full-Stack is an entirely different skillset, and specifically addresses two important architectural aspects: In-depth understanding of the communication between backend and frontend (i.e. front-facing API design, which a great lot of people are awful at), and in-depth knowledge of both frontend and backend. A full-stack developer is not just a "jack of all trades, master of none", they're sufficiently experienced in multiple parts of the stack equally. Juniors do exist, but they're approaching their education and training vastly differently from single-aspect juniors.
There's something to be said for experts in either backend or frontend, obviously, in that they may have an easier path to further depth in their field than full-stack developers, but the skillset they develop also requires such depth. Whereas full-stack developers tend to work shallower and instead have a skillset that more easily enables deep dives where needed.
In summary: Full-stack developers make better architects, but single-aspect developers make better programmers.
Specialization is healthy for our industry. The complexity of software is increasing rather than decreasing. It will be disastrous more often than not to naively believe a single skillset in a single person is sufficient for that complexity. Professional programmers should push back on the false idea of the full stack developer.
After ten rich years in the software industry, I've learned this:
About 95% of so-called fullstack developers are bad at CSS/HTML, and many of them treat this tech as a "bonus" or "nice to have".
In other words, only like 5% of fullstack developers are "genuine fullstack developers".
I agree with this part in particular.
Yes, I have full-stack on my resume and some of my profiles. But you'll notice that if I use the term at all, I almost always put either "full-stack web developer" or "full-stack JavaScript developer" because I am comfortable referring to myself as such while still keeping a straight face. That doesn't make me a "full-stack developer" nor a "full-stack engineer".
Quite frankly, I think web/JavaScript developers and software engineers are two completely different categories with minimal overlap. Some might disagree, but I don't consider myself a software engineer and I'm okay with that. It doesn't make me better or smarter than you and vice versa.
I think the T-model is probably the best learning approach overall, but it's also perfectly fine just to be exceptionally skilled in only one or two languages or frameworks. Some companies may pass you up, but eventually someone is going to notice your skill level and decide that it's worth it to hire you because you can outperform the others, even if it costs them a little more.
Fullstack = backend + frontend. It doesn't mean they can do everything, but that they are willing to work on both sides. The label is stuck in the industry, especially the job market. So yeah, you have to accept that fullstack is not equivalent to mastery of everything.
If your boss is over 40, he probably is a full-stack developer. We brought specialized branches into the software industry because we lacked talent. In the 2000s we started producing specialized workers, to get them onto the market rapidly. Instead of hiring engineers, we started hiring low-qualified workers, whom we ironically call "specialists". Enjoy the McDonald's inside ;-). The purpose of coding is just to make hardware work. If you start claiming MySQL has to be studied for 30 years, you should probably review your skills...
I don't completely agree, there are devs working on smaller/simpler apps or sites and they just do "the whole thing" (backend and frontend), often even as a solo dev. Not every app or site needs a complicated frontend which requires a "specialist".
Maybe the term 'full stack dev' is overvalued and more glamorous sounding than it deserves, but I do like the (old fashioned?) idea of a competent dev who can "do the whole thing" including analysis/design, coding, testing up to rollout and even admin.
P.S. neither backend nor frontend are generally "easy" or simple in any shape or form, and with bigger and more complicated projects you're often indeed better off hiring specialists in either of the two areas
That is true, especially with founders who have to do the whole thing to get going, without needing specialists. I think that in this context (and the one you advocate for), being fullstack is more an attitude than an aptitude. With that, I agree.
It should rather be the T-shaped model. For example, I'm a backend dev: I know and use DDD/event sourcing/CQRS (terms that are usually unknown to frontend or JS devs) on my backend side. But I can do a proof of concept with JS, React, etc. Am I a fullstack?