Robert C. Martin (Uncle Bob) has been banging on the "software professionalism" drum for years and I've been nodding my head with every beat. As a profession, I believe software developers need to up their game big time. However, I've become concerned about Uncle Bob's approach. I reached my breaking point the other day when I read his blog post titled Tools are not the Answer.
He took issue with a recent Atlantic article titled The Coming Software Apocalypse. Let me see if I can summarize the theses of these two articles.
We are writing more and more software for safety-critical applications and the software has become so complex that programmers are unable to exhaustively test or comprehend all the possible inputs, states, and interactions that the software can experience. We are attempting to build systems that are beyond our ability to intellectually manage.
We need new ways of helping software developers write software that functions correctly (and is safe) in the face of all this complexity. The current methods of producing safety-critical software are especially dangerous to society because when software contains defects we can't observe them in the same way we can observe that a tire is flat--they're invisible.

Uncle Bob, in contrast, diagnoses the root cause like this:
- Too many programmer (sic) take sloppy short-cuts under schedule pressure.
- Too many other programmers think it’s fine, and provide cover.
And the obvious solution:
- Raise the level of software discipline and professionalism.
- Never make excuses for sloppy work.
Safety-critical software systems, which are the topic of the Atlantic article, are held to shockingly high quality standards. The kind of requirements analysis, planning, design, coding, testing, documentation, verification, and regulatory compliance that goes into these systems is miles beyond what any normal organization would consider for an e-commerce website or mobile app, for example.
Read They Write the Right Stuff and tell me if you think Uncle Bob's on the right track (note this article was written 21 years ago and the state-of-the-art has advanced significantly). Does it sound like the NASA programmers just need more discipline and professionalism coupled with never making excuses for sloppy work?
Dr Nancy Leveson was quoted several times in the Atlantic article but Uncle Bob completely ignored those parts.
So let's review an excerpt from one of her talks:
I've been doing this for thirty-six years. I've read hundreds of accident reports and many of them have software in them. And every someone (sic) that software was related, it was a requirements problem. It was not a coding problem. So that's the first really important thing. Everybody's working on coding and testing and they're not working on the requirements, which is the problem. (emphasis added)
She can't say it much clearer than that. Did I mention that she's an expert? Did I mention that she works on all kinds of important projects, including classified government weapons programs?
In his paper Safety Critical Systems: Challenges and Directions, Dr John Knight describes many challenges of building safety-critical systems, but developer discipline and professionalism are not among them. This is as close as he gets:
Development time and effort for safety-critical systems are so extreme with present technology that building the systems that will be demanded in the future will not be possible in many cases. Any new software technology in this field must address both the cost and time issues. The challenge here is daunting because a reduction of a few percent is not going to make much of an impact. Something like an order of magnitude is required.
Developing safety-critical systems is extremely slow, which adds to cost. But QA practices virtually ensure that the delivered software functions as specified in the requirements. Uncle Bob could possibly argue that some projects are slow because the developers on those projects are undisciplined and unprofessional. But a claim like that requires evidence, and Uncle Bob offers none.
My goodness, we need more and better tools. When I first started programming, all I had was a text editor with basic syntax highlighting--that's it. I used to FTP into the production server to upload my code and run it; I didn't even have a development environment.
Later I moved to Eclipse and thought I was stupid for not doing this sooner. Eclipse caught all kinds of errors I missed with the basic text editor. It just highlighted them like a misspelled word in a word processor--brilliant.
A couple of years later I adopted Subversion as my VCS and I thought I was stupid for not doing this sooner. I could see all the history for my project, I could make changes and revert them. It was awesome. Since then, my toolbox has kept growing:
- code reviews/pull requests/Jira
- advanced IDEs with integrated static analysis, automated refactoring tools, automatic code formatting, and unit tests that run at the push of a button
- property-based testing (QuickCheck family)
- virtual machines
- open source libraries
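Of the tools in that list, property-based testing may be the least familiar. Here's a minimal sketch of the idea in plain Python (the encode/decode functions are hypothetical stand-ins I made up for illustration); real QuickCheck-family libraries such as hypothesis add smarter generators and automatic shrinking of failing inputs:

```python
# Property-based testing sketch: instead of hand-picking a few test
# cases, generate many random inputs and check that a property holds
# for all of them.
import random
import string

def encode(s: str) -> str:
    # Hypothetical example function: escape & first, then <.
    return s.replace("&", "&amp;").replace("<", "&lt;")

def decode(s: str) -> str:
    # Unescape in the reverse order.
    return s.replace("&lt;", "<").replace("&amp;", "&")

def check_roundtrip_property(trials: int = 1000) -> bool:
    """Property: decoding an encoded string recovers the original."""
    rng = random.Random(0)  # seeded for reproducibility
    alphabet = string.ascii_letters + "&<> "
    for _ in range(trials):
        s = "".join(rng.choice(alphabet)
                    for _ in range(rng.randint(0, 30)))
        if decode(encode(s)) != s:
            return False  # a real library would shrink this input
    return True
```

Instead of a handful of examples the author happened to think of, the property gets checked against a thousand generated inputs, including awkward mixes of special characters.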
It's been nearly twenty years since I started programming and my tools have changed significantly in that time. I can only imagine how the tools that become available in the next twenty years will change how we write and deliver code.
Let's look at some possibilities.
My static analyzers still don't understand my code and can only pick up simple mistakes. They flag tons of false positives. They can be slow on large code bases. And I'd love it if I had just one static analyzer that did everything I wanted instead of four or five. It's also time-consuming to write custom rules. There's plenty of room for improvement there.
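To make "custom rule" concrete, here's a toy rule of my own using Python's standard-library ast module--roughly the kind of check a linter plugin performs. Real analyzers chain hundreds of rules like this, which is part of why writing and maintaining them takes so long:

```python
# A minimal custom static-analysis rule: flag bare `except:` clauses,
# which silently swallow every exception including KeyboardInterrupt.
import ast

def find_bare_excepts(source: str) -> list[int]:
    """Return the line numbers of bare `except:` handlers in source."""
    tree = ast.parse(source)
    return [node.lineno
            for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]
```

A rule this simple is easy; the expensive part is handling the long tail of real-world code without drowning users in false positives.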
Then there are "correct by construction" techniques. I watched this video. He had me at "a provable absence of runtime errors". So I got a book on SPARK (a subset of Ada) and started learning. Wow, you might be able to write highly reliable and correct software in SPARK, but it's going to be a slow process (aka expensive).
Is this the future? I don't know, but if it were easier to program in SPARK it might have a better chance in safety-critical software circles. It would also be interesting if someone developed formal-method capabilities for my favorite programming language that were accurate and easy to use. "No need to write tests for this module, the prover says it's mathematically sound"--yes please.
Software to track each requirement to the code that implements it and the tests that prove that it was implemented correctly
I watched a video where the presenter was talking about the difficulty her team had tracking thousands of requirements to specific code and test cases, and back again, for regulatory compliance purposes in safety-critical systems. The task became much more difficult as they tried to keep everything in sync while the requirements, tests, and code changed over the course of the project. That team, and every team like them, needs better tools. And, eventually, I'd love to see that kind of thing built into the IDE for my favorite programming language, provided it was easy to use.
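The core bookkeeping is simple to state. Here's a toy sketch of the traceability idea (all requirement IDs and names are hypothetical, invented for this example); real compliance tooling would also trace requirements to the implementing code and keep everything in sync as artifacts change:

```python
# Toy requirements-to-test traceability check: each test declares
# which requirement IDs it verifies, and a report flags any
# requirement with no covering test.

REQUIREMENTS = {
    "REQ-001": "Altitude shall never be reported as negative.",
    "REQ-002": "Sensor loss shall trigger failover within 50 ms.",
}

# Test registry: test name -> requirement IDs it verifies.
TEST_COVERAGE = {
    "test_altitude_clamped": ["REQ-001"],
}

def untraced_requirements() -> list[str]:
    """Return requirement IDs that no test claims to cover."""
    covered = {rid for rids in TEST_COVERAGE.values() for rid in rids}
    return sorted(set(REQUIREMENTS) - covered)
```

The hard part isn't this query--it's keeping the registry truthful across thousands of changing requirements, which is exactly where better tooling would pay off.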
Then there are formal specification languages to consider. The Atlantic article mentions TLA+ but there are others. Now imagine that these languages were easy to use. Imagine that you had a tool that could help you construct a formal specification in an iterative way, coaching you along to make sure you covered every case. And when you were done, you could get it to generate some or all of the code for you. Plus, if you got stuck you could just find the answer on StackOverflow. Cool? Hell, yes!
I'm sure we can brainstorm dozens of new or improved tools in the comments that would help us write better, more correct code at a lower cost.
The fundamental problem is that even the brightest among us don't have the intellectual capacity to understand and reason about all the things that could happen in the complex interacting systems we are trying to build. It's not an issue of discipline or professionalism. These systems can exhibit emergent behavior, or behave correctly but in ways their designers never foresaw.
That's why Dr Leveson's book is so important. Instead of trying to figure out all those states and behaviors, we "just" have to specify the states and behaviors that are not safe and prevent the software from getting into them. Well, it's more complicated than that, but that's part of it.
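To illustrate the flavor of that inversion, here's my own toy sketch (not Leveson's actual method, and with a made-up hazard): rather than enumerating every safe state, you name the hazardous ones and refuse any transition into them:

```python
# Toy safety constraint: declare the hazardous states and block any
# transition that would enter one, instead of trying to enumerate
# every acceptable state of the system.

# Hypothetical hazard for an elevator-like system: moving with
# the doors open.
UNSAFE_STATES = {("doors_open", "moving")}

def transition(state: tuple, new_state: tuple) -> tuple:
    """Apply a state change, rejecting any declared-unsafe state."""
    if new_state in UNSAFE_STATES:
        raise ValueError(f"blocked: {new_state} is declared unsafe")
    return new_state
```

The set of unsafe states is usually far smaller than the set of all reachable states, which is what makes this framing tractable at all.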
I'm all for increasing software professionalism and discipline, but Uncle Bob's wrong about how to prevent "The Coming Software Apocalypse" in safety-critical software systems. Experts in the field don't rank programmer professionalism and discipline anywhere near the top of their priorities for preventing losses.
More programmer discipline and professionalism can't hurt but we also need ways of taming complexity, better tools, ways to increase our productivity, ways to reason about emergent behavior, research on what actually works for developing safety-critical software systems, new and better techniques for all aspects of the software development process, especially better ways of getting the requirements right, and so much more.
I know there are tons of programmers churning out low-quality code. But organizations building safety-critical systems have processes in place to prevent the vast majority of that code from making it into their systems. So if the software apocalypse comes to pass you can be pretty sure it won't be because some programmer thought he could take a short-cut and get away with it.
What do you think? Agree or disagree? I'd love to hear your thoughts.
Here's a video of Uncle Bob's software professionalism talk: https://youtu.be/BSaAMQVq01E
Nancy Leveson's book Engineering a Safer World is so important that she released it in its entirety for free: https://www.dropbox.com/s/dwl3782mc6fcjih/8179.pdf?dl=0
Excellent video on safety-critical systems: https://youtu.be/E0igfLcilSk
Excellent video on "correct by construction" techniques: https://youtu.be/03mUs5NlT6U