harsh banthiya
Always revisit this page when in doubt: the basics of the Unix philosophy.

The basics of Unix Philosophy

It all began with Ken Thompson's efforts to design a small but capable operating system with a clean interface.

The Unix philosophy is bottom up, not top down.

  • Doug McIlroy (the inventor of Unix pipes) said:
    • Make each program do one thing well. To do a new job, build afresh rather than trying to complicate old programs.
    • Design and build software to be tried early; don't hesitate to throw away the clumsy parts and rebuild them.
    • Use tools in preference to unskilled help to lighten the programming task, even if you have to detour to build the tools, and expect to throw some of them out after you've finished with them.

This is the Unix philosophy: write programs that do one thing and do it well. Write programs to work well together. Write programs to handle text streams, because that is the universal interface.
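McIlroy's summary can be made concrete with the classic filter shape: read a text stream in, write a text stream out, and do exactly one thing along the way. A minimal sketch in Python (the `shout` filter is a made-up example):

```python
def shout(lines):
    """A filter that does one thing: upper-case each line of a text stream."""
    for line in lines:
        yield line.upper()

# A filter reads a text stream and writes a text stream, so it composes
# with any other program through a pipe. Wired to the real streams it
# would be: sys.stdout.writelines(shout(sys.stdin))
# Demonstrated here on an in-memory stream instead:
print("".join(shout(["hello\n", "world\n"])), end="")
```

Because the filter neither knows nor cares what produced its input or what will consume its output, it slots into a pipeline anywhere.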

Rules for the Unix tribe, from Rob Pike's Notes on Programming in C:

  1. You cannot tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second-guess and put in a speed hack until you've proven that's where the bottleneck is.
  2. Measure. Don't tune for speed until you've measured, and even then don't unless one part of the code overwhelms the rest.
  3. Fancy algorithms are slow when n is small, and n is usually small. Fancy algorithms have big constants. Until you know that n is frequently going to be big, don't get fancy. (Even when n does get big, use Rule 2 first.)
  4. Fancy algorithms are buggier than simple ones, and they're much harder to implement. Use simple algorithms as well as simple data structures.
  5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.
  6. There is no Rule 6.
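Rules 1 and 2 amount to: measure with a real timer before tuning. A minimal sketch using Python's standard `timeit` module (the two candidate functions are purely illustrative):

```python
import timeit

# Two candidate implementations of the same job: building one string
# out of many parts.
def join_builder(parts):
    return "".join(parts)

def concat_builder(parts):
    out = ""
    for p in parts:
        out += p
    return out

parts = ["x"] * 1000

# Measure before tuning: let the timer, not intuition, locate the cost.
print("join:  ", timeit.timeit(lambda: join_builder(parts), number=200))
print("concat:", timeit.timeit(lambda: concat_builder(parts), number=200))
```

Only after numbers like these show one part overwhelming the rest does Rule 2 permit tuning.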

Similarly, the Unix philosophy implies a number of design rules; let's look at them.

Rule of Modularity: Write simple parts connected by clean interfaces.

"Controlling complexity is the essence of computer programming" (Brian Kernighan). Debugging dominates development time, and getting a working system out the door is usually less a result of brilliant design than of managing not to trip over your own feet too many times.

Rule of Clarity: Clarity is better than cleverness.

Because maintenance is so important and so expensive, write programs as if the most important communication they do is not with the computer that executes them but with the human beings who will read and maintain the source code in the future (including yourself).

That doesn't just mean comments; it also means choosing your algorithms and implementations for future maintainability. Buying a small increase in performance with a large increase in the complexity and obscurity of your technique is a bad trade.

Rule of Composition: Design programs to be connected with other programs.

Unix tradition strongly encourages writing programs that read and write simple, textual, stream-oriented, device independent formats. Under classic Unix, as many programs as possible are written as simple filters, which take a simple text stream on input and process it into another simple text stream on output.

To make programs composable, make them independent. A program on one end of a text stream should care as little as possible about the program on the other end. It should be easy to replace one end with a completely different implementation without disturbing the other.
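The shell pipeline `grep error log | wc -l` is the canonical picture of this. The same composition can be sketched in Python with two independent stream functions (the names are invented for illustration); neither end knows anything about the other:

```python
def match(pattern, lines):
    """One filter: keep only the lines containing the pattern."""
    return (line for line in lines if pattern in line)

def count(lines):
    """Another filter: reduce a stream to a line count."""
    return sum(1 for _ in lines)

log = ["error: disk full\n", "ok\n", "error: timeout\n"]

# Composed like `grep error | wc -l`; either stage can be replaced by a
# completely different implementation without disturbing the other.
print(count(match("error", log)))  # → 2
```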

Rule of Separation: Separate policy from mechanism; separate interfaces from engines.

Policies are ways of choosing which activities to perform; mechanisms are the implementations that carry those policies out. Policy and mechanism mutate on different timescales, with policy changing much faster than mechanism.
Thus, hardwiring policy and mechanism together has two bad effects: it makes policy rigid and harder to change in response to user requirements, and it means that trying to change policy has a strong tendency to destabilize the mechanisms.
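A small sketch of the separation, assuming a toy record-sorting engine: the mechanism is a generic sorter, while the policies (which ordering to use) live outside it and can change freely without touching the engine:

```python
# Mechanism: a generic engine that enforces whatever ordering it is given.
def sort_records(records, order):
    return sorted(records, key=order)

# Policies: ways of choosing. These change much faster than the
# mechanism, so they are kept out of it.
by_name = lambda r: r["name"]
by_age = lambda r: r["age"]

people = [{"name": "ada", "age": 36}, {"name": "alan", "age": 29}]
print(sort_records(people, by_age)[0]["name"])  # → alan
```

Adding a new policy (say, sorting by a new field) never requires reopening or destabilizing `sort_records`.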

Rule of Simplicity: Design for simplicity; add complexity only where you must.

There is no need for technical machismo. Programmers are bright people who are (often justly) proud of their ability to handle complexity and juggle abstractions, and they often compete with their peers to see who can build the most intricate and beautiful complexities. When their ability to design outstrips their ability to implement and debug, the result is an expensive failure.

Rule of Parsimony: Write a big program only when it is clear by demonstration that nothing else will do.

'Big' here has the sense both of large in volume of code and of internal complexity. Allowing programs to get large hurts maintainability. Because people are reluctant to throw away the visible product of lots of work, large programs invite over-investment in approaches that are failed or suboptimal.

Rule of Transparency: Design for visibility to make inspection and debugging easier.

Debugging typically occupies three-quarters or more of development time. A system is transparent when you can look at it and immediately understand what it is doing and how. It is discoverable when it has facilities for monitoring and displaying internal state, so that your program not only functions well but can be seen to function well.

Designing for these qualities has implications from the very beginning of a project. At minimum, it implies that a program must use simple interfaces that can be easily manipulated by other programs, in particular test and monitoring harnesses and debugging scripts.

Rule of Robustness: Robustness is the child of transparency and simplicity.

Software is robust when it performs well under unexpected conditions that stress the designer's assumptions, as well as under normal conditions.

One very important tactic for being robust under odd inputs is to avoid having special cases in your code; it is also one more way to keep software simple and robust.

Rule of Representation: Fold knowledge into data, so program logic can be stupid and robust.

Even the simplest procedural logic is hard for humans to verify. Data is more tractable than program logic. It follows that where you see a choice between complexity in data structures and complexity in code, choose the former. In evolving a design, you should actively seek ways to shift complexity from code to data.
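One common way to shift complexity from code to data is a dispatch table: the knowledge lives in a mapping, and the logic that consumes it stays trivially simple. A sketch (the command set here is made up):

```python
# Knowledge folded into data: adding a command means adding a row,
# not adding new control flow.
COMMANDS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}

def run(cmd, a, b):
    # The program logic stays "stupid": look the command up and call it.
    return COMMANDS[cmd](a, b)

print(run("mul", 6, 7))  # → 42
```

The equivalent if/elif chain would grow a new branch, and a new place for bugs, with every command; the table version only grows a row of data.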

Rule of Least Surprise: In interface design, always do the least surprising thing.

The easiest programs to use are those that demand the least new learning from the user; or, to put it another way, the easiest programs to use are those that most effectively connect to the user's pre-existing knowledge. Well-designed programs treat the user's attention and concentration as a precious and limited resource, to be claimed only when necessary.

Rule of Repair: Repair what you can - but when you must fail, fail noisily and as soon as possible.

Yes, it is best when software can cope with unexpected conditions by adapting to them, but the worst kinds of bugs are those in which the repair doesn't succeed and the problem quietly causes corruption that doesn't show up until later.

"Be liberal in what you accept, and conservative in what you send." Well-designed programs cooperate with other programs by making as much sense as they can of ill-formed inputs; they either fail noisily or pass strictly clean and correct data to the next program in the chain.
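A sketch of both halves of the rule, using a hypothetical port-number parser: repair what you can (stray whitespace), but fail noisily and early on input you cannot make sense of:

```python
def parse_port(text):
    """Be liberal in what we accept, but never pass bad data onward."""
    value = int(text.strip())  # repair what we can: tolerate whitespace
    if not 0 < value < 65536:
        # Fail noisily and as soon as possible, rather than letting a
        # bad value quietly corrupt something downstream.
        raise ValueError(f"port out of range: {value}")
    return value

print(parse_port(" 8080 "))  # → 8080
```

The quiet alternative, clamping or defaulting a bad port, is exactly the kind of "repair" that hides corruption until much later.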

Rule of Economy: Programmer time is expensive; conserve it in preference to machine time.

In the old days, machines were slower and much more expensive; now machine cycles are cheap. Code in higher-level languages, and ease the programmer's burden by letting the language do its own memory management.

The obvious way to conserve programmer time is to teach machines how to do more of the low level work of programming.

Rule of Generation: Avoid hand-hacking; write programs to write programs when you can.

Human beings are notoriously bad at sweating the details. Any kind of hand-hacking is a rich source of delays and errors. Generated code (at every level) is almost always cheaper and more reliable than hand-hacked code. That's why we have compilers and interpreters. It pays to use code generators when they can raise the level of abstraction, that is, when the specification language for the generator is simpler than the generated code.
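A toy illustration of the idea, with an invented spec: the specification (a list of field names) is much simpler than the accessor code generated from it, and regenerating is cheaper than hand-editing every accessor:

```python
# The specification: far simpler than the code it will expand into.
FIELDS = ["host", "port", "user"]

def generate_accessors(fields):
    """Write a program: emit one accessor function per field."""
    lines = []
    for f in fields:
        lines.append(f"def get_{f}(cfg):")
        lines.append(f"    return cfg['{f}']")
    return "\n".join(lines)

namespace = {}
exec(generate_accessors(FIELDS), namespace)  # compile the generated code
print(namespace["get_port"]({"host": "h", "port": 22, "user": "u"}))  # → 22
```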

Rule of Optimization: Prototype before polishing. Get it working before you optimize it.

"90% of the functionality delivered now is better than 100% of it delivered never." Premature optimization, done before the bottlenecks are known, is the only error to have ruined more designs than feature creep, and local premature optimization often hinders global optimization. "Make it run, then make it run right, then make it fast."

Rule of Diversity: Distrust all claims for "one true way".

Even the best software tools tend to be limited by the imaginations of their designers. Nobody is smart enough to optimize for everything, nor to anticipate all the uses to which their software might be put. Embrace multiple languages, open extensible systems and customization hooks everywhere.

Rule of Extensibility: Design for the future, because it will be here sooner than you think.

If it is unwise to trust other people's claims for "one true way", it is even more foolish to believe them about your own designs. Never assume you have the final answer. Therefore, leave room for your data formats and code to grow; otherwise, you will be locked into unwise early choices because you cannot change them while maintaining backward compatibility.

When you design protocols or file formats, make them sufficiently self-describing to be extensible. Always, always either include a version number, or compose the format from self-contained, self-describing clauses in such a way that new clauses can be readily added and old ones dropped without confusing format-reading code.
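A sketch of a self-describing, versioned record, assuming a made-up JSON format: the reader checks the version number and ignores clauses it doesn't know, so the format can grow without breaking old readers:

```python
import json

def read_name(blob):
    """Read one field from a versioned, self-describing record."""
    data = json.loads(blob)
    if data.get("version", 1) > 2:
        # Refuse formats newer than we understand, loudly.
        raise ValueError("record format too new for this reader")
    # Unknown keys are simply ignored, so new clauses can be added
    # later without confusing old readers.
    return data["name"]

record = json.dumps({"version": 2, "name": "example", "future_clause": []})
print(read_name(record))  # → example
```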

When you design code, organize it so future developers will be able to plug new functions into the architecture without having to scrap and rebuild the architecture. Put in the "if you ever need to ... " comments.

When you design for the future, the sanity that you save may be your own.

Discussion (1)

Remo Dentato

Thanks for reminding us of these indisputable truths!
I feel these concepts are too often forgotten nowadays.
Unix's success and longevity in the real world are testimony that those early "rules" were sound and led to successful development.