This is the first article in my "Lessons from UNIX History" series. By day I am a DevOps Tech Lead; by night, a computing archeologist. This indulgence lets me share my findings alongside practical, modern-day applications. Ideally, we can all benefit today by taking a moment to reflect on the past.
This is more than history; it is our heritage as engineers.
Human Tendencies
As humans, we love upgrades and new features. In fact, where would we be without them? The software industry relies on continuous delivery of new content, which is generally fine. But what happens when we get too attached to a system, product, or application? The result is the "second system effect".
Second System Effect
This term is best explained by Brian Kernighan in his 2019 book "UNIX: A History and a Memoir":
"... it's tempting to try to create a new system that fixes all the remaining problems with the original while adding everybody's favorite new features too. The result is often a system that's too complicated, a consequence of taking on too many different things at the same time ..."
This excerpt captures the challenges we face in today's industry quite well. But we would do well to dig into the history a bit deeper.
Multics to Unics (UNIX)
We owe a surprising number of features and applications to UNIX's predecessor, Multics, including:
- A shell
- User authentication
- Streams (output, redirection, file descriptors)
- Commands like "ls" and "date"
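To make those inherited concepts concrete, here is a minimal shell sketch of streams, redirection, and file descriptors as they survive in any modern UNIX-like shell (the file names are just placeholders):

```shell
# Every process gets three file descriptors:
# 0 (stdin), 1 (stdout), 2 (stderr).

date > today.txt                     # redirect stdout (fd 1) to a file
ls /no/such/path 2> err.log || true  # redirect stderr (fd 2) separately
cat today.txt err.log                # streams can be chained back together
wc -l < today.txt                    # feed a file into stdin (fd 0)
```

The same descriptor model underpins pipes (`date | wc -l`), which is why these Multics-era ideas still shape how we compose tools today.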
Multics was a time-sharing operating system that was feature-rich yet inherently flawed. So much so that Bell Labs withdrew from the project in 1969, late in development. Business aside, it was clear that the OS was too bloated (sound familiar?). This is where Ken Thompson decided to branch off into new territory. We're very glad he did, because Unics (later renamed UNIX) was born.
No doubt, he had invested a lot of time in Multics. Yet he was emotionally mature enough to "walk away" from a dying system full of bloat. He would embrace a new way of thinking, albeit at the cost of some very favored features.
But that was 50+ years ago!
It amazes me how much history repeats itself. Really, this is just as much of a problem today. Computers have gotten smaller, our operating systems larger, and our knowledge vaster, but deep down we have the same tendencies as our predecessors. In fact, it could be argued that with continuous delivery, our appetite for favorite features has gone up, not down.
What can you do about it?
Ken Thompson was open-minded about the future. He recognized the need to abandon pet projects once they became overloaded. We, too, need to leave behind legacy features and not be afraid to go back to the drawing board. (I realize the irony in stating that lessons from a "legacy" era help us move forward technologically.)
If you are in the DevOps / systems engineering world, be open-minded about what is possible today. At times, I catch myself leaning on stale assumptions about AWS or Docker. Given how fast our industry moves, what was "impossible yesterday" may well be "possible today". Don't be afraid to forge your own path. You just might invent something for the next generation!
Next Article "Avoiding Tool Sprawl"
Today, we are bombarded with hundreds of tools, packages, and middleware that promise to solve all our problems. Of course, I think we can all agree that FOSS is wonderful. But how many tools do we really need? Have you noticed how many files end up in the root directory of your code repos? In the next article I will tackle this real-world problem, and what we can learn from the past yet again.
Top comments (1)
I really liked the part about not being afraid to go back to the drawing board. Leaving legacy features behind may feel wrong to some, even sacrilegious. However, doing so creates more opportunities for different kinds of innovation, which is really where the original creators of any system had to begin. This doesn't mean you lose your values by doing so. It only means you are applying them in a clearer, more updated way.