The latest State of JS survey results are out. As always, survey results need to be taken with a grain of salt. There is always a bit of selection bias involved in these sorts of surveys whereby certain groups tend to be far more likely to respond. These concerns are somewhat reinforced by the survey's own reporting wherein almost 70% of respondents came from 3 sources.
As another example, the survey's respondents were 91.3% male. While people who identify as women or non-binary are severely underrepresented in our industry, the latest information would put the percentage more likely in the 15-17% range.
All those caveats aside, this is the largest survey focused exclusively on JavaScript, with 21,717 responses, so it can be interesting to parse the results and see how they align with your own opinions and perceptions of the community. Not a ton surprised me this year, but here are some somewhat random things that stood out to me when reading it.
For a very different take, check out this post by Jerod Santo.
We Overstate Our Expertise
The survey does not seem to have asked people to state their JavaScript proficiency, but, considering the target audience, it's probably safe to assume they lean advanced or expert with JavaScript. So it is a bit surprising that 56.4% of respondents consider themselves to be either advanced or expert in CSS, including about 40% saying they are a CSS expert.
In addition, 64.9% say they are advanced or expert in back-end, though trending slightly towards advanced over expert.
These results would indicate that a majority of respondents likely see themselves as advanced or expert in JavaScript, CSS and backend development. The survey laid out pretty high standards for these definitions (as seen in the images above). Even accounting for just over 50% of respondents having over 5 years of experience with JavaScript (which, for the record, also seems unusually high), color me extremely dubious.
Rankings? 🤔
The survey displays a section it calls "rankings" for frameworks. The way this is displayed shows Vue (87%), Svelte (88%) and React (89%) sitting almost even for frontend frameworks.
This struck me as odd. Sure, Svelte has had a lot of momentum lately, but having it ranked almost tied with React, above Vue and well above Angular seemed off. However, the problem wasn't the data here so much as the terminology and the choice of how to display it. I think it can lead to misunderstandings, as it did initially with me.
The results above reflect only a "satisfaction" ratio. There is a menu of options, which at least to me wasn't initially obvious, that allows you to switch to interest and awareness ratios. I believe "rankings" was chosen as the heading because these stats were grouped together, but I think it only compounds the initial confusion and potential misinterpretation.
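For readers unfamiliar with how these ratios differ, here is a rough sketch. The bucket names and formulas below are my assumptions based on the survey's published methodology (satisfaction among users, interest among non-users, awareness among all respondents); treat this as illustrative, not the survey's actual code.

```javascript
// Hedged sketch of how the three "ranking" ratios are (roughly) computed.
// Bucket names are assumptions, not the survey's real field names.

function satisfactionRatio({ wouldUseAgain, wouldNotUseAgain }) {
  // Satisfaction only counts people who have actually used the tool.
  return wouldUseAgain / (wouldUseAgain + wouldNotUseAgain);
}

function interestRatio({ wantToLearn, notInterested }) {
  // Interest only counts people who have heard of, but not used, the tool.
  return wantToLearn / (wantToLearn + notInterested);
}

function awarenessRatio({ total, neverHeard }) {
  // Awareness counts everyone who answered.
  return (total - neverHeard) / total;
}

// A niche tool can score ~88% satisfaction among a small user base while
// trailing badly on awareness, which is how Svelte can sit next to React.
const svelteish = satisfactionRatio({ wouldUseAgain: 880, wouldNotUseAgain: 120 });
console.log(svelteish.toFixed(2)); // 0.88
```

This is why a single "rankings" number can mislead: each ratio is computed over a different denominator, so an 88% satisfaction score says nothing about how many people use the tool at all.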
Once I understood the way this was displayed, there were few surprises in the results. Same for back-end frameworks.
Perhaps the only surprise was the popularity of Next.js and how quickly Meteor has fallen out of favor. In fact, my biggest surprise was in the mobile and desktop rankings.
NativeScript isn't even on the list. Perhaps I have a bias there myself since I worked at the company that makes it, but the other tools' results suggest it was a major missed inclusion, as were others, including, arguably, PWAs, even if that encompasses a range of tooling solutions. Flutter may have been a big miss as well, since the target audience seems to be partly JavaScript developers and it's not like there's a State of Dart survey.
Where Do We Go to Learn?
As someone who focuses on creating developer content, it's always interesting to me to see where developers are going to learn and keep up with their field. CSS Tricks has a substantial lead over everyone else with Dev.to coming in second. I was a bit surprised to see both beating out JavaScript Weekly as getting a top link in that newsletter seems to bring in large amounts of traffic, but maybe folks think of it as more of a secondary source since the content resides elsewhere.
Medium received a lot of votes in the freeform answers, despite the dreaded paywall. I was also surprised that almost 20% still consult W3Schools, barely trailing MDN, which is a far better resource. Lots of folks seem to be using Udemy, Egghead.io and FrontEndMasters. That doesn't surprise me, but no mention of Pluralsight at all? That does.
Opinions on JavaScript
Most of the data in the opinions section didn't surprise me. Folks seem to think things are headed in the right direction, though they feel less strongly about it than in years prior. I was a little surprised that most respondents do not agree that building JavaScript apps has gotten too complex now - only 40.3% either agree or strongly agree.
I thought the percentage would be higher. But I suppose we've already learned that a big chunk of respondents are apparently experts in everything related to the web, so maybe I shouldn't have been surprised.
Notably, the percentage of folks who think JavaScript is changing too fast has dropped, even though technically the language changes every year now. This doesn't terribly surprise me. ES6 was a major shift that took folks time to adjust to. However, recent changes are much less dramatic. I also feel as though the sense that there is a new framework every week has cooled.
What to Make of It?
It is fun to delve into these results and, despite any complaints, I am grateful to the folks who put this together. It is a lot of work. It can be useful to challenge some assumptions you may have, learn about new technologies you perhaps hadn't heard of, and try to pick up on trends. However, I don't think there is anything in here that should cause anyone to make major changes to the way they do things or the tools they use.
Top comments (2)
Great write-up! But:
I would love to see better sources for that range. The article you linked to seems focused on American companies, and even more on Silicon Valley companies, while our survey targets the whole world.
The Stack Overflow developer survey for example has almost exactly the same gender breakdown as we do:
insights.stackoverflow.com/survey/...
Calling out the gender imbalance revealed by our survey (or Stack Overflow's) probably stems from a good intention, but I find it a bit misleading to imply that the problem lies with the survey itself when, as far as I can tell, it's most likely a reflection of the very real state of things.
Thanks for writing, Sacha, and thank you for all your hard work on this survey. I really do appreciate it and think the information is valuable.
First, to be clear: you have a large sample size for the community you are surveying and have done an excellent job of outreach. I am only pointing out some potential biases in the responses so that people look at the results with healthy skepticism. You are right that other developer surveys have the same issues. I think it can be tough for developer surveys to reach beyond the audiences that follow social media or blogs closely, and the responses, to me, seem to reflect that slightly.
As for the gender imbalance: there aren't great numbers, but I have looked into this for a number of posts I've written in the past. Yes, the SO survey, which is a larger sample, has a larger imbalance. Here's where I get my guesstimate from. The percentage of women getting a CS degree sits somewhere between the upper teens and the mid-20s, depending on which data you look at. The percentage at large tech companies that release this data is generally in the mid-to-upper teens. Some studies, admittedly a handful of years old, put the percentage around 20%. This study from 2016, for instance, has the percentage at 24%: qz.com/814017/the-percentage-of-co...
These data points tend to lead me to believe (though I have not seen reliable data on this) that the percentage is probably in the mid-teens. Again, I only mention this because I think that people need to consider potential biases in the data - regardless of the survey (not because of any specific criticism of your survey).