Blaine, I think your article is interesting, but it is built on a straw man fallacy. Robert Martin is not saying that NASA is not (or was not) doing its job properly. The example he gives is:
one of the programmers had reused some code from a different platform and had not realized that it had a built-in 30 second truncation.
That is not something that can be fixed with a tool or by better requirements.
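The quoted failure mode is worth pausing on. A minimal sketch of that kind of latent defect might look like the following Python snippet; the names and the scenario are illustrative assumptions, not taken from any incident report:

```python
# A hypothetical helper reused from a different platform. On its
# original platform, a 30-second cap was a sensible safety limit;
# on the new platform it silently corrupts longer durations.

MAX_DURATION_S = 30  # built-in limit appropriate only for the original platform

def clamp_duration(seconds: float) -> float:
    """Return the requested duration, silently capped at the platform maximum."""
    return min(seconds, MAX_DURATION_S)

# The new platform legitimately needs a 45-second operation...
requested = 45.0
actual = clamp_duration(requested)
print(actual)  # 30.0 -- truncated with no warning, no error, no log entry
```

The point of the sketch is that nothing here violates a requirement, trips a static-analysis tool, or raises an exception; the function does exactly what it was written to do, just on a platform where that behavior is wrong.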
The example given in the article linked by Robert Martin is not about NASA. It's about the disastrous unintended acceleration in Toyota cars ("there were more than 10 million ways for key tasks on the on board computer to fail, potentially leading to unintended acceleration").
Once again, this is a problem you should be able to fix without adding "should not accelerate unintendedly" to the requirements.
Thanks for reading and taking the time to comment, Pierre.
I think I made a pretty convincing case that software complexity in safety-critical systems can exceed our ability to comprehend it.
I quoted a safety-critical-systems expert with a PhD who said the same thing (two experts, actually). And I quoted a section of Mr. Martin's blog post where he lays all the blame for the problems described in the Atlantic article on the programmers.
I don't believe I set up a straw man argument. It's a pretty simple disagreement, but as you can see from the multitude of comments here, there are no simple solutions.
Cheers.