These days, it seems like we have two fundamental choices in development methodologies. Either you are "agile" (whatever that means) or "waterfall" (which apparently means "not agile").
For my purposes here, "agile" processes are those designed to accept and embrace change during development.
When I say "waterfall" here, I mean any process that attempts to create a project by following a strictly ordered sequence of steps, where each step must conclude before the following one can begin. For example, only after requirements are written can development start.
I'm not going to get into who is agile and who is not, nor which named processes and certifications are the most or least agile. All of that has been covered, ad nauseam, elsewhere. The point I want to make here is that agile is the method we adopt when we don't know what we're building.
I don't mean that agile is for idiots, or the uninformed, or the backward or anything like that. Agile ideas are great and have brought a lot of sanity and value to the software development world. I like agile, really!
Take a moment to look at the Twelve Principles of Agile Software articulated by the authors of the Agile Manifesto. Many of those principles are about figuring out what to build ("harness change", "business people and developers must work together daily", etc.) and making sure you are building the right thing ("working software is the primary measure of progress", "early and continuous delivery of valuable software"). The rest of the principles are about the actual building of the product ("sustainable development", "self-organizing teams", etc.).
These principles describe a development philosophy that is as much about discovery as it is about development. Agile accepts and embraces the fact that what the business says it wants is probably not complete or accurate. One of the great innovations recognized in these principles is that the business absolutely needs to be involved throughout the development process. This is important because, for one thing, we're so bad at communication that talking very frequently about very small pieces helps avoid I/O errors. But more importantly, working together in these tight iterative cycles helps the business discover what it really wants. Agile has taught us how to discover and build at the same time.
Really, that's pretty brilliant. However, if we knew what we were building in the first place, we wouldn't have to discover it along the way. In other words, agile is what we do when we don't know what we're doing.
Let's talk about waterfall for a minute. Why do we hate it so much? In theory, the business has a need, which it defines as a set of software requirements. Those requirements are handed off to the developers who implement them. The tested result is given back to the business, and a small team of devs is assigned to fix, maintain and enhance the product into the future.
On one level, it does seem like it would be nice to be handed a stack of requirements, lock yourself in your code dungeon for a couple weeks and emerge with a finished product without having to talk to anyone but fellow nerds and the pizza delivery guy for the duration. Well, it would be nice if you knew that the requirements were going to make sense, that they were not going to change, and that the thing you were building was going to be useful to anyone and see the light of the production day.
The problem, as we have already suggested, is that the business rarely actually knows what it needs. Since knowing the business need is the first step in a long, serial process, getting that part wrong results in the slow-motion train wreck with which so many of us are familiar.
In truth, waterfall can work well, but it depends heavily on ideal conditions. It only works when you know exactly what you are doing.
I think the reason that we hate waterfall so much is not that the methodology is inherently flawed, but because we and our businesses are so bad at planning and communication. The waterfall methodology, then, becomes the "denial" methodology, in which we delude ourselves into thinking we know what is needed and how to build it, deferring uncomfortable realities until release (which is probably very late, if it happens at all).
For the reality in which most of us live, which is largely unplanned and sometimes unplannable, the agile philosophy makes much more sense.
Why are agile processes so tightly associated with software? Sure, they are inspired by meat-space practices like the Toyota Production System, but a Corolla is still built in a waterfall process. It is designed, then built, then shipped, and finally sold. You can't afford to discover that there's too little back seat leg room after the body has been stamped and welded. TPS is about being flexible with the car production process, not with the cars themselves.
I think we can get away with agile processes in software because our wares are soft and more easily modified or scrapped and rebuilt. I wonder if agile, as cool as the idea is, isn't a concession to broken business habits; a work-around for a bug in an imported library, not itself the truly correct solution.
Take a Pixar movie, for example. I don't work at Pixar, but I would imagine that their productions must essentially follow a waterfall-like process. Even so, they make such great movies because they invest so much time, effort, and expertise in the very front of the process. Because modeling, animation, and lighting are relatively expensive, hashing out the story (the definition of the movie) happens on low-fi storyboards. Think about it: you need the animation before you can render it, you need the voices recorded before you do the animation, you need the script before you record the voices, and you need the story before you can write the script. By the time rendering has started, it's too late to think up Riley's "tragic vampire romance island". Films don't have the luxury of discovering themselves all along the way. They do have mid-stream corrections sometimes, but these are expensive.
Consider the mighty Saturn V rocket, the major components of which were built by no fewer than five separate contractors. While each contractor had some latitude on implementation of its assigned components, a decision that changed the mass of an engine or height of a propellant tank had implications that rippled across the whole, mind-numbingly complex project and would have to be communicated to hundreds of people with potentially thousands of physical pieces "refactored" as a result. There's no way you'd get away with showing up to a stand-up on the production floor with half of an S-IVB assembled behind you and say, "yesterday I tried 7075-T6 aluminum alloy for the forward skirt, but today I think I'm going to see how a 2319 does." These things have to be planned, to excruciating detail, before anything gets built.
What if the software industry figured out how to invest the best of our efforts in the very front of the process? What if the software engineers and the business people sat down together and discovered needs and requirements together in a medium even softer than code? Is there something special about software that prevents this?
I've been thinking about an alternate method, which I'll call the tree process. The basic concept is a series of recursive passes breaking parts of an idea into successively smaller pieces. Start with the root idea: is it viable? Does it provide value? What are the pieces it is made from? Are those component pieces viable? What smaller pieces would be used to make them?
Lather, rinse, repeat.
As you go, you will generate a tree structure that starts with the original idea as the root, with the things necessary to make it work branching off from it. Each node breaks down into its own requirements until, at the bottom, you are one layer away from the actual implementation. Along the way you will identify and test assumptions: "will customers find this feature useful?", or "what are the performance implications of this algorithm?". By the end of this exercise you should have answered pretty much all of the questions, and the tree tells you exactly how everything fits together. Development then starts at the leaves, resolving each node up the tree until the final product, the root idea, is finished.
During development you might learn things that didn't come up in planning, and that will be fine. Just go back to the tree and make the necessary adjustments. Because of careful advance planning, the effects of these changes should be limited and the tree will show you exactly what the ripple effects will be.
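To make the shape of this concrete, here is a minimal sketch of the tree process as a data structure. Everything in it is illustrative: the `Node` class and the example product tree are hypothetical, not a real tool. The point is that each node holds the pieces it is made from, and "development" is a post-order walk that resolves every child before its parent, so the leaves finish first and the root idea finishes last.

```python
class Node:
    """One piece of the idea: a feature, component, or requirement."""

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def build(self, done=None):
        """Development starts at the leaves: resolve every child
        before the node itself (a post-order traversal)."""
        if done is None:
            done = []
        for child in self.children:
            child.build(done)
        done.append(self.name)  # a node is finished only after its parts are
        return done


# The root is the original idea; the branches are what it needs to work.
# This example product is entirely made up.
root = Node("product", [
    Node("billing", [Node("payment API"), Node("invoices")]),
    Node("accounts", [Node("signup"), Node("login")]),
])

print(root.build())
# leaves come first, the root idea ("product") is finished last
```

A mid-stream change, in this picture, is just swapping or splitting a subtree; the parent chain above it tells you exactly which nodes feel the ripple.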
In my experience, relatively little of what we do as developers is actually writing code. Most of our mental effort is figuring out what we are building and how we are going to do it. The code is the easy part. What if we and our business colleagues could muster the discipline to front-load all of that effort? Then you really could lock yourself away for two weeks and not only know exactly what you are building, but also why it's important.
What do you think? Does this tree idea make sense? Or, is agile really the ideal process, with other engineering disciplines helplessly limited by physics or tooling? What other successful strategies have you seen?