Waterfall Works!

When I'm providing training or giving a talk on Agile Software Development, I love to shock the attendees with the following statement:
Waterfall works!
Gasps of disbelief abound... "WTF?!  This guy who has just described how he has been working with Agile for over a decade is telling me that Waterfall works?!"

The truth, however, is undeniable.  Tens and probably hundreds of thousands of software systems and products have been shipped and used in the 40+ years since Winston Royce's paper Managing the Development of Large Software Systems was published.  That process has been applied to everything from modern web-based applications to massive telecom projects, from tiny programs built by one person in a few weeks to systems with tens of millions of lines of code built by hundreds of people over multiple years.

We simply cannot say that Waterfall doesn't work.  The catch is, though, that it's at best a sub-optimal way to deliver systems, especially now in the 21st century.  When I present this to people, I frame it in the context of when Dr. Royce wrote his paper and presented it at the IEEE Wescon conference.

First, if you actually read the paper you'll notice the classic waterfall model on the top of page 2 showing serial steps with each being fully completed before the next starts.

If you read the very first sentence of the very first paragraph after that figure you will see this:
I believe in this concept, but the implementation described above is risky and invites failure.
So, Dr. Royce actually knew that the process adopted by so many people was flawed from the start!  He goes on in the paper to show an iterative model that would be quite familiar to anyone in the Agile community.

Too bad most people didn't read past page 2 of the paper!

The second point I make is that we need to take into consideration when Dr. Royce presented this paper.  This occurred at the IEEE Wescon conference in August 1970.  I was a month away from starting Kindergarten then, and Ron Jeffries was still in his first decade of programming professionally. ;)

Let's also consider the computing environment in 1970.  A mainstream IBM System/360 in 1970 had a single processor capable of up to 0.034 MIPS (34 KIPS), a theoretical maximum of 16 MB of memory (typically 256 KB main memory and 8 MB secondary), and 225 MB of disk space.  It cost about $50K per month (in 1970 dollars!) to lease, and about $15 million to buy.

Compare that to my mainstream (2011) Android phone, which has a processor running at 740 MIPS, 384 MB of RAM and a 16 GB SD card for secondary storage.  Its retail price was $349.
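A back-of-the-envelope calculation, using only the figures quoted above, makes the scale of that gap concrete:

```python
# Rough comparison of the 2011 phone to the 1970 IBM System/360,
# using only the numbers cited in the text above.
s360_mips = 0.034        # IBM System/360, ~34 KIPS
phone_mips = 740         # 2011 Android phone
s360_price = 15_000_000  # approximate 1970 purchase price, USD
phone_price = 349        # 2011 retail price, USD

speedup = phone_mips / s360_mips
price_ratio = s360_price / phone_price

print(f"Speedup:     ~{speedup:,.0f}x")      # roughly 21,765x
print(f"Price ratio: ~{price_ratio:,.0f}x")  # roughly 42,980x
```

In other words, a device that fits in your pocket is four orders of magnitude faster and four orders of magnitude cheaper than the machine Royce's audience was programming, before even adjusting for 40 years of inflation.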

We also need to consider the programming environment in 1970.  Most work was done with punch cards, and it could take hours to determine if a large card deck had even compiled, let alone run, let alone worked properly.  Given that environment, of course you're going to spend a lot of time up front designing, writing and reviewing the code before creating the cards and before submitting them.
Contrast that with contemporary IDEs such as Eclipse, IntelliJ IDEA, Visual Studio, Xcode, etc.  Those tools usually have a preference setting for how long they should wait before highlighting a syntax error or compiler warning!  We have tools and frameworks for unit and acceptance testing that can execute thousands of tests per second on our desktop machines.  We have automated tools to check for problems in our designs, and automated tools to safely refactor our code to improve the design incrementally.
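To make the feedback-speed point tangible, here is a toy illustration (not any particular test framework) that times thousands of trivial checks, the kind of verification that once required an overnight card-deck run:

```python
import time

# Time 10,000 trivial assertions to show how cheap a modern
# check cycle is compared to an hours-long compile-and-run
# turnaround on punch cards.
start = time.perf_counter()
for _ in range(10_000):
    assert 2 + 2 == 4  # stand-in for a real unit-test assertion
elapsed = time.perf_counter() - start

print(f"10,000 checks in {elapsed:.4f} seconds")
```

Even on modest hardware this completes in a few milliseconds, which is why modern teams can afford to run their entire test suite after every small change rather than batching verification into a single up-front phase.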

The final point I try to make is that, at the time he wrote the paper, Dr. Royce was working with the IBM Federal Systems Division on projects for the U.S. Department of Defense.  These projects were typically very large, hence the name of the paper, and the contracting model employed treated the development of software the same as the construction of a building.  Considering the tools and development environment in 1970, that analogy was much closer to the truth then than it is now, but it was still a flawed view of software development.  We know now that software development is a design activity, not unlike the work that architects do in designing a building or that aerospace engineers do in designing an aircraft.  Both of those domains use many iterations to create and refine the designs prior to the actual construction work.  The key difference is that software development is almost entirely a design effort, the construction aspect being compilation, linking and deployment.

So, yes, the waterfall model works - you can deliver software that way.  Our view of how software is developed, however, has been refined over the decades, and the dizzying pace of technological advances has enabled us to move to more iterative, incremental approaches to delivering systems.

In the end, it's not that waterfall doesn't work, it's that we no longer need it.


Paul said…
I'm an old-fashioned guy. There are times when I look at new & high technology and wonder if we really are better off with the new shiny toys and 'improved' ways of doing things.

As a tester, for example, there's always a push to find better ways to automate testing/checking activities. I have found, however, that the good old Scientific Method and direct/personal interaction with the project team members (and customers, when possible) provides me with the most success. You can't automate that kind of interaction and investigation. There are some kinds of checking activities that computers can clearly help us with, and we should definitely make use of them for these tasks. I often find that there is too much "snake oil" in the new testing tools that are being marketed out there today.

On the other hand, thinking about the Waterfall model, I'd be totally okay with it completely dying off. Really, I am. I will not miss it. I will have a really big party on the day that the last Waterfall project dies... er, I mean completes.

Perhaps "it works" in that people actually deliver software with some level of quality - good or otherwise. That doesn't really matter to me.

There is a flu-like sickness running through the IT industry as people/consumers have become used to a certain level of dissatisfaction and problems associated with the computers, electronics and software that they use in their daily lives. I see this sickness as having a direct 1:1 correspondence to the poor development practices that are cobbled together and used to create the needed products and services.

I like your blog post. I think it's important to frame the usefulness of the Waterfall model in the context of the one-step-up-from-a-loom computers that were in use 50 years ago. Socially speaking, the 1960s were also a very different time than today in how people interacted with each other. For instance, slavery was still fresh in a lot of people's minds, if not still practiced.

Continuing to use this model today, in context of current computer hardware and software technology, tools, approaches, social relationships and structures is, in my opinion, sheer and complete idiocy.

Your final note that "we no longer need it" is correct but perhaps a little "soft" in my opinion. That model is causing harm and preventing us from reaching our true technological and human potential. That model creates an aristocracy and form of slavery in today's world that has become the accepted norm. Too many people turn a blind eye to this reality.

This brings me to perhaps the only cautionary note I have for this blog post. I fear that the title "Waterfall Works!" will be the rallying cry of the dimwitted and the aristocratic control-mongers out there. In today's ADD, information-riddled, marketing-bombardment world, some people don't make it past the title, let alone make it to page 2.

Cheers. Paul.
Solutions can be upgraded and modified from time to time. Agile software development may have an overview of the software's features, but it accepts any change of direction from the existing plan that increases the efficiency of the output.
David Locke said…
Where the waterfall fails due to sub-optimality, Agile fails as well. Where Agile is better, situations when the developers have access to the users, internal development projects is used to push the notion that Agile is better in all situations. But, in market facing, commercial products that user is a statistical aggregate, so the averaging of the delivered functionality is still with us just like with the waterfall.
Dave Rooney said…

Thanks for the very insightful and heart-felt comment. I do plan to respond, but it may take an entire blog post to do so!
Dave Rooney said…

DL: "Where the waterfall fails due to sub-optimality, Agile fails as well."

Yes - neither is perfect, and neither can be. There are these pesky things called 'people' that populate the teams that use the processes. If it weren't for them, software development would be very easy indeed! :)

DL: "Where Agile is better, situations when the developers have access to the users, internal development projects is used to push the notion that Agile is better in all situations. But, in market facing, commercial products that user is a statistical aggregate, so the averaging of the delivered functionality is still with us just like with the waterfall"

I'm not sure what you are suggesting here. From my perspective as someone who has been in the 'Agile world' for 10+ years, I would suggest that the ability to adjust to what that statistical aggregate is actually buying would be a tremendous ability for an organization to have. Indeed, that's what agility is all about.