I'm getting a head start on celebrating the two-decade anniversary of The Joel Test. If you haven't heard of Joel Spolsky, or otherwise live under a rock, he's the CEO of Stack Overflow and co-founder of Trello and Fog Creek Software (now Glitch). He's a hardened software veteran, with a background going back to the Microsoft Excel project in the early 90s.
He also hosts a very popular blog at Joel on Software, though his activity there has dwindled over the years. That's how I originally found him, early on when I first started programming, and I've been a huge fan ever since. It helps that he's a fantastic writer, articulate and insightful while still being fun and accessible. More importantly, I've always appreciated his values, in his attitude of respect for the craft of software development, the stakeholders of the companies he runs, and the members of the communities he builds.
So what's The Joel Test? If you didn't just read through that link, it's a 12-item, dead-simple checklist for evaluating the effectiveness of a software team. Hitting all 12 won't automatically make your team great, but hitting 10 or fewer is almost certainly going to cause problems. It's obviously not iron-clad, but it's a useful heuristic, and I've definitely used it during interviews - it's a great conversation starter if nothing else. So, two decades (in two years) later, what's changed? Let's walk through it:
1. Do you use source control?
This is just as relevant today, and much easier to attain. Git is the clear winner today. You can use other source control systems, but there's really no compelling reason to. Platforms like GitHub and Bitbucket make it practically effortless to maintain a production-grade source control system, so there's no good excuse for not taking advantage of the technology.
2. Can you make a build in one step?
Still true, and the field has moved forward here. With modern CI/CD systems like Travis, you can hook into your source control system to automatically build (and release!) in zero steps. This is more relevant in some fields (web) than others (embedded), but you should still definitely automate your build process - bonus points if it runs entirely without manual interaction.
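To make the one-step idea concrete, here's a minimal sketch of a single-entry-point build script in Python. The `make` targets are hypothetical placeholders - substitute whatever your project's actual stages are:

```python
# One-step build: a single command runs every stage in order and stops
# at the first failure. The `make` targets below are placeholders.
import subprocess
import sys

STAGES = [
    ("compile", ["make", "all"]),
    ("test",    ["make", "test"]),
    ("package", ["make", "package"]),
]

def build(run=subprocess.call):
    """Run each stage; return 0 on success, 1 at the first failure."""
    for name, cmd in STAGES:
        print(f"[{name}] {' '.join(cmd)}")
        if run(cmd) != 0:
            print(f"Build failed at stage: {name}")
            return 1
    print("Build succeeded.")
    return 0

# Calling build() (e.g. from a `python build.py` entry point) executes
# every stage; a CI system reduces this to zero steps by invoking the
# same script on every push.
```

The point of the single entry point is that the build can't drift into tribal knowledge - whatever the stages are, there's exactly one command to run them.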
3. Do you make daily builds?
Daily is good, but today it's easy to do even better. Again, with modern CI/CD systems, you can build on every commit before merging changes to your master branch. Broken builds mean downtime for unhappy developers, and have you ever spent time with unhappy developers on downtime? I have. You shouldn't. There's very little reason for the build to ever be broken.
4. Do you have a bug database?
JIRA is pretty standard here, but maybe heavyweight for a lot of teams. I really appreciate his point about creating a dead-stupid five-column table - if a Google Sheets document works for the scale of your team, then keep it simple! You'll know when you've outgrown it, and then there are plenty of more powerful bug tracking tools to upgrade to. The important point is to write down your bugs in one place, and make everyone go there.
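That five-column table is simple enough to sketch in a few lines. Here's one hypothetical shape for it in Python - the column names and the sample bug are made up for illustration:

```python
import csv
import io

# Five columns: enough to reproduce, triage, and track a bug.
COLUMNS = ["steps_to_reproduce", "expected", "observed", "assigned_to", "fixed"]

bugs = [
    {
        "steps_to_reproduce": "Open Settings, click Save twice",
        "expected": "Settings saved once",
        "observed": "App crashes on the second click",
        "assigned_to": "dana",  # hypothetical teammate
        "fixed": "no",
    },
]

def to_csv(rows):
    """Render the bug list as CSV - trivially importable into a spreadsheet."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

When this outgrows a spreadsheet, the same five fields map directly onto any real bug tracker, so nothing is lost in the upgrade.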
5. Do you fix bugs before writing new code?
In general, this is good practice. I'd argue there are business models where time-to-market is critical, and letting small, especially cosmetic, bugs slide is just plain economical.
But I've personally worked in legacy C/C++ codebases of five million lines and up where this wasn't practiced, and I can assure you - keep it up long enough, and software behavior becomes so unpredictable and erratic in response to even minor changes that new development is practically impossible. Even when you can move forward, estimates are meaningless, schedules are unreliable - it's just a mess. If you want any meaningful level of predictability in your development schedule, you need a reliable foundation to build on.
6. Do you have an up-to-date schedule?
This one is tough. Personally, I don't believe that estimation is practical or reliable in software development, and insistence on deterministic and reliable estimation is counterproductive and misses the point. In software, if it's a solved problem, there's a library for it - use it. If it's not a solved problem, how do you expect to estimate how long it'll take?
Joel has another great post on this (there's always one!) - Evidence Based Scheduling. Basically, he recommends sizing upcoming work in detail relative to work completed in the past, and then projecting dates forward based on historical time-to-completion. But - and this is extremely important - the result is a projection, not a commitment, and takes the form of a range with attached confidence values, not a single date.
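The mechanics can be sketched with a small Monte Carlo simulation, assuming per-task hour estimates and a history of velocities (estimated time divided by actual time) - all the numbers below are invented for illustration:

```python
import random

def project(estimates, velocities, trials=10_000, seed=0):
    """Evidence-Based-Scheduling-style projection: for each trial, scale
    every remaining estimate by a randomly sampled historical velocity,
    sum the results, and read off confidence percentiles."""
    rng = random.Random(seed)
    totals = sorted(
        sum(est / rng.choice(velocities) for est in estimates)
        for _ in range(trials)
    )
    return {p: totals[int(p / 100 * (trials - 1))] for p in (50, 75, 95)}

# Hypothetical history: velocity = estimated hours / actual hours per task.
history = [1.0, 0.8, 0.5, 1.2, 0.9]
# Hour estimates for the upcoming work.
remaining = [16, 8, 4, 12]

projection = project(remaining, history)
# The result is a range with attached confidence levels, not a single
# number: projection[50] <= projection[75] <= projection[95].
```

Note how the output matches Joel's framing: you ship a 50/75/95-percent-confidence range, and the range naturally tightens as more completed work feeds the velocity history.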
7. Do you have a spec?
Programmers still hate this, and it's still just as important. It's not that it's difficult, just that it involves two things developers tend to avoid: writing documentation and talking to people.
Before you start writing code, you should know what user problem you're trying to solve. If you do, you should be able to quickly write a few sentences describing it. Put those sentences in a shareable doc - don't make it more complicated than it needs to be. Then share it with a team member, preferably someone closer to the users than you. What do they think? Revise until they're happy with it - now you can code! In practice, this cycle is low-overhead but extremely valuable. Make it part of your normal workflow.
8. Do programmers have quiet working conditions?
Probably not, but they should. The trend towards open offices is a huge push in the wrong direction from a developer productivity standpoint. Joel's stance has always been "programmers get an office, with a door."
It doesn't help that the tech industry is concentrated in cities with (not unrelated) extremely high cost of living and real estate prices. Open floor plans are tempting to management for a reason. But if you look at the research, and do the math, the cost of interruptions to highly-skilled, well-compensated, intensive knowledge workers exceeds the cost to put up some walls and doors.
(As a practical aside: noise-isolating headphones help if you're a developer without a private office and aren't going to win that battle. I recommend the passive route - the classic Sennheiser HD 280 Pros are fantastic.)
9. Do you use the best tools money can buy?
Being successful in software development requires a certain inclination and level of ability that's just fundamentally rare. It's a meme in the industry that there's a "developer shortage." Whether you buy that or not, it's a fact that a huge number of software development jobs stay unfilled, often until companies give up and close the position. There just aren't enough people entering the industry to fulfill demand.
High demand, low supply means high market rates - the average salary for a senior software engineer in 2018 is upwards of $110k. You can reasonably double that to figure the cost-to-employer, including the overhead of benefits, administration, rent, utilities, equipment, etc. That means if you have a software engineer on staff, you're spending roughly $100 every hour just for them to show up!
If you can invest in some tooling that saves a single engineer even 10 minutes a day, that's roughly $4,000 a year, every year. It really doesn't take much to justify the expense.
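The break-even arithmetic, using the post's ballpark figures (assumptions, not measured data):

```python
# Rough tooling break-even, using the ballpark figures above.
COST_PER_HOUR = 100        # fully loaded cost of one engineer, in dollars
MINUTES_SAVED_PER_DAY = 10
WORKDAYS_PER_YEAR = 240    # assumed; adjust for your calendar

yearly_savings = COST_PER_HOUR * (MINUTES_SAVED_PER_DAY / 60) * WORKDAYS_PER_YEAR
print(round(yearly_savings))  # 4000
```

Under these assumptions, a $1,000 license or a faster laptop pays for itself within a few months.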
10. Do you have testers?
The rise of automated testing has ironically reduced emphasis on this (which was never strong to begin with), but skipping human testers is still a false economy. Ultimately, an actual human, who didn't write the software, and probably doesn't think like a software developer, is going to sit down and try to use it. They're going to immediately do something that the developer didn't anticipate, and the system will break in ways the implementer didn't foresee. That user isn't going to file a bug report - they're going to quietly close your app and Google for something else. If someone ever mentions your software, that's the story they're going to tell.
So either you don't test, and you lose users, or developers end up doing the testing, which is like having surgeons clean the bedpans. That way there's no need to hire additional nursing staff!
11. Do new candidates write code during their interview?
This is definitely still important, but tools like Codility have shifted much of the emphasis toward automation. Interview time spent assessing technical skills is time taken away from assessing values and culture fit, which are likely better long-term indicators of candidate success. Pseudo-code and general problem solving are great for assessing how a candidate frames, deconstructs, and solves problems, but nuts-and-bolts coding without available reference material doesn't simulate any reasonable working condition in actual practice.
Definitely have candidates write and test actual code during the hiring process, but it might not be the most effective use of direct interview time given the tools available today.
12. Do you do hallway usability testing?
This was always a bit of a goofy one just due to the idiosyncratic wording. Whether or not you find it endearing, the core point - that you need to watch people actually try to use your software - will always be valid. The subpoint, that it doesn't have to be a heavy, formal process, is also extremely relevant. Grab the person sitting next to you, sit them down in front of your welcome screen, and ask them to complete some specific task.
If you aren't already doing this regularly, or something similar (e.g. systems to automate this with screen capture), you'll likely be surprised at how quickly issues surface, and how consistent those issues are across testers.
Addendum: 13. Do you do automated testing?
I saw Joel speak at a corporate event a few years back, and someone asked him what he would change about The Joel Test given recent developments in the field. The point that sticks out in my memory is that he'd likely add something about automated testing - not necessarily TDD, or any heavy formal process (notice the theme?) - just that you're thinking about and implementing strategies to automate your testing efforts where feasible. I'm totally on board with this, so I'm throwing it in as an informal #13.
Where does this leave us?
Overall, Joel's original 12 items have held up well over the past two decades, which is an extremely long time in software development. Still, there have definitely been relevant changes that warrant some tweaks. When I use The Joel Test in the future, I'm going to use my own revised version:
- Do you use Git, or some lesser source control system?
- Can you build and release in one step?
- Do you build and test before merging to master?
- Do you have a bug database?
- Do you fix bugs before writing new code?
- Do you have an up-to-date schedule?
- Do you write a spec before writing code?
- Do programmers have quiet working conditions free of interruptions?
- Do you use the best development tools money can buy?
- Do you have human testers?
- Do you do automated testing?
- Do new candidates write code as part of the hiring process?
- Do you watch people actually try to use your software?
This post was originally published to CheckGit I/O.
Top comments (3)
Yeowch - I get the points about relying solely on automated testing, but this one statement I really take issue with. Having worked in QA/QE and in busy Operations, it really irks me that this is how developers view their role - too good to do the dirty work - and consider the folks who pick up the (excuse my French) as beneath them. What's worse, most management seems to set this up as acceptable and perpetuate it.
Even in a setting with manual testers, the right thing to do is put everyone on equal footing and shift testing concerns left, so QA is at the table when development is scoped out and the overhead is distributed: developers do the responsible upfront testing (TDD/BDD, code coverage, unit/integration, and in some cases scale/systems/E2E testing as well), and QA "assures" quality (validation/verification, exploratory/ad-hoc testing, hard-to-automate testing).
This is so undeniably true, but manual QA isn't the escape hatch to the problem either - that leads to an upside-down test pyramid, really slow delivery, and grumpy engineers all over. Both automated and manual testing still have a place, with a sensible division of QA responsibilities across developers and manual testers (arguably Ops as well, if you buy into SRE/CRE).
I'm totally on board with what you just said. I'm a developer, but I don't think QA is a lesser job than development. A good QA is at least as valuable as a good developer, and a bad QA is as dangerous as a bad developer.
Full disclosure: I'm a software development manager at a large company.