Recent news in Australia is that NAPLAN, a country-wide standardised test, ran into technical difficulties in its second year of allowing students to take the test online.
I was discussing it on Twitter, trying to explain that even if we had a bunch of techs/programmers sit together, brainstorm every possible edge case, and test for them all, there would still most likely be an edge case overlooked, sometimes due to imperfect information, other times due to resource constraints, and so on.
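To give a rough sense of the scale involved, here is a purely illustrative back-of-the-envelope sketch in Python. The dimensions and counts are made up for the example, not NAPLAN's actual test matrix:

```python
# Back-of-the-envelope illustration (made-up numbers): even a modest
# online form has a huge input space once you combine every field's
# variations, and exhaustive testing quickly becomes infeasible.
fields = {
    "browser": 6,          # Chrome, Firefox, Safari, Edge, ...
    "os": 5,               # Windows, macOS, Linux, iOS, Android
    "network": 4,          # fast, slow, flaky, offline mid-test
    "answer_input": 10,    # empty, very long, emoji, pasted text, ...
    "concurrent_users": 8, # 1, 100, 10k, everyone at 9am sharp, ...
}

combinations = 1
for name, variations in fields.items():
    combinations *= variations

print(f"{combinations:,} combinations from just {len(fields)} dimensions")
# 9,600 combinations, and real systems have far more dimensions,
# plus failure modes nobody thought to list in the first place.
```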
The impression I get from non-tech people is that it is possible to account for every case, and that failing to do so signifies incompetence.
Does anyone have a nice analogy/situation/scenario/explanation that gets that message across, so that they can be kinder and more empathetic towards the difficulties associated with programming and anything tech-related?
Top comments (3)
The above scenario is something I constantly struggled with during my time as a vendor. And as counter-intuitive as it is, the answer is rarely logical, but instead emotional.
After all, the problem you are facing is ...
The thing is, especially for those outside tech, things are changing at such a constant rate that to them it might as well be magic at times. As Arthur C. Clarke famously put it, "any sufficiently advanced technology is indistinguishable from magic." And they will somehow expect us, who claim to be "pros" in it, to come up with a magical solution.
This still happens on a smaller scale, for things that are difficult to explain. Medicine is a prominent example: many patients refuse to accept a doctor's advice that there is no "cure" for their problem.
So I digress, but once I came to terms with the above, I realized the best answer is at times not to answer emotions with logic. That is something I admit I am not the best at, and it is very case-by-case specific.
Another thing that I like to say personally, especially for edge cases that involve humans: my personal litmus test would be "does an existing human solution solve the problem?"
Once we get a sense of how expensive the human solution can be, we can put the cost of the technological solution into perspective, and see how impractical it may be.
That being said, as for NAPLAN specifically, or for any other nationwide system that is extremely time-sensitive (such as voting): I am of the stance that such systems should be built to be deployed as multiple self-isolated clusters, perhaps with a server in each test centre. After all, the internet can and may go down at any one location.
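To make the self-isolated cluster idea concrete, here is a minimal sketch in Python. The names (`TestCentreServer`, `flaky_upload`) are hypothetical, and a real system would need a durable on-disk queue rather than an in-memory one; the point is only that submissions succeed locally even when the central service is down:

```python
import queue
import time

class TestCentreServer:
    """Hypothetical self-isolated test-centre server: student submissions
    are accepted locally first, so a central outage never blocks the exam."""

    def __init__(self, centre_id):
        self.centre_id = centre_id
        self.pending = queue.Queue()  # would be a durable store in practice

    def submit(self, student_id, answers):
        # Always succeeds locally, regardless of upstream connectivity.
        self.pending.put({"student": student_id,
                          "answers": answers,
                          "ts": time.time()})

    def sync(self, upload):
        # Drain the local queue whenever the central service is reachable;
        # a failed upload is re-queued and retried on the next sync.
        while not self.pending.empty():
            record = self.pending.get()
            try:
                upload(record)
            except ConnectionError:
                self.pending.put(record)
                break  # central service unreachable; try again later

# Usage sketch: the exam keeps running even though every upload fails.
def flaky_upload(record):
    raise ConnectionError("central service unreachable")

centre = TestCentreServer("centre-042")
centre.submit("student-1", {"q1": "B"})
centre.sync(flaky_upload)  # submission stays safely queued locally
```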
That being said, this could drive up costs drastically. And since I am way too disconnected from the topic to know, perhaps a cost-benefit analysis was done, and the solution decided on was to fall back onto pen and paper if things go wrong.
If that's the case, then putting aside what the news media is reporting, it's a case of things going as planned (from a risk-management and contingency point of view).
As a mathematician I'm very accustomed to asking the 'whys' (since everything can be justified logically in maths), so I appreciate your additional insight - especially the Clarke quote!
I definitely wanted to give the NAPLAN techs the benefit of the doubt, mainly because I've noticed mainstream media tending to focus on and capitalise on mistakes for headlines, which pains me to some extent.
Maybe it's easier to shrug, say "Murphy's Law" and call it a day. Obviously I can't control how people end up perceiving tech as a result, but I sure don't want it to be made any worse! Maybe the answer is to normalise and get across the message that making mistakes in tech is normal. That's something I can definitely do!
What you suggested, on normalizing problems, would really help to get others to understand the problem, instead of demanding that it be magically solved.
It would definitely help the developers who are working on it, and who may have made mistakes on it, to fix it. And if it's a resource constraint, it could perhaps lead to more resources being allocated.
Saying this as someone whose code has crashed live systems in active use before 😢