28 September 2011

The Need for Speed... and Why That's Bad!

I've written before about how software development teams are obsessed with going as fast as possible, and how that isn't necessarily a good thing.  That obsession isn't something that magically appeared with Agile methods - it has been around as long as I've been in the business.  I would assume that the mantra of "we want it yesterday, perfect and for free" came from somewhere! :)

I would suggest, though, that the 'need for speed' has become more acute with Agile.

Groups with whom I've worked are typically coming from a world of serial, phase gate processes.  A ton of time is spent defining and analyzing what will be built up front, and the construction phase is only a relatively small portion of the overall effort (at least on the Gantt charts!).

The approach that Agile methods take is to shift the beginning of construction forward, working from the start to verify assumptions and decisions in the form of a real system.  This is absolutely a good thing, but I routinely find groups who think that ALL of the time that would have been spent analyzing and designing the system can be replaced by a few days in a requirements workshop.  After a couple of days in a week-long workshop, people become frustrated and start saying,
Can't we just go start building this now?!
Generally this sentiment comes from people who had not been very involved in the requirements and analysis phases under a traditional process.  By the time these people received the specs for the thing they were supposed to build, they were already under schedule pressure and thus felt that any time spent discussing what was to be built was waste.

Using an Agile process is different.  Yes, we absolutely advocate spending much less time up front determining what is to be built.  We don't, however, advocate believing that in a few days the requirements for a complex system that integrates with other complex systems can be fleshed out into user stories that several teams can immediately pull into an iteration!  Jeff Patton suggests that 1 to 2 weeks is sufficient to create plans for 3 to 6 months of work.  Others suggest anything from a few hours to "as long as it takes".

Context, of course, is everything in this case.  One system may be able to make use of a Feature Fake that takes a couple of hours to hack together in order to run an experiment which will drive the real requirements.  For another system, the Feature Fake concept isn't just impossible, it would be illegal!
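To make the Feature Fake idea concrete, here's a minimal sketch of what one might look like.  All of the names here (FeatureFake, invoke, requests) are purely illustrative, not from any particular framework: the point is that the feature is advertised to users but, instead of doing real work, it simply records interest so the team can measure demand before writing the real requirements.

```python
class FeatureFake:
    """Stands in for an unbuilt feature and counts how often users try it."""

    def __init__(self, name):
        self.name = name
        self.requests = 0

    def invoke(self, user):
        # Each attempted use is an experiment data point, not real work.
        self.requests += 1
        return f"Sorry {user}, '{self.name}' is coming soon!"

# Advertise "Export to PDF" in the UI without actually building it.
export_to_pdf = FeatureFake("Export to PDF")
print(export_to_pdf.invoke("alice"))
print(export_to_pdf.invoke("bob"))
print(f"Interest so far: {export_to_pdf.requests} clicks")
```

A couple of hours of this kind of hack can tell you whether anyone actually wants the feature, which in turn drives the real requirements.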

So, in many cases you can't simply jump from many weeks or months of analysis and design to a couple of hours in a workshop and expect to have an adequate level of understanding of what's to be built.  You need to take the time - slow down - at the beginning to create a shared understanding of the business problem to be solved, and to an extent how you're going to solve it.

"HERESY!! Agile is all about going faster!", you say?

Well, no, it's not.  It's about doing enough to be able to build the right thing at the right time.  The definition of enough will vary from domain to domain, system to system, and even team to team.  By virtue of not doing things that you don't need and deferring things that aren't important right now, you will indeed appear to go faster.

It's altogether possible that given the same set of high level requirements, an agile process may take longer to deliver all of them than a serial process.  An agile process, though, would seek to identify the most important subset of those requirements and deliver those as early as possible in order to obtain real-world feedback and incorporate that into the product.

Agile doesn't simply mean "fast".  It also means, "able to change direction quickly".

22 September 2011

Who Should Perform the Sprint Review?

A concerned ScrumMaster came to me recently lamenting the fact that a Product Owner "ran" a sprint review.  My response was, "That isn't necessarily a bad thing."  The ScrumMaster then went on to describe how the PO in question directed everything in the meeting and didn't allow much discussion.  OK, so some context helps clarify things. :)

The question remains, though, is it always a bad thing for the Product Owner or Customer to perform the demo at the end of the iteration or sprint?

The Scrum Guide is certainly clear that the team performs the demo, but I come from the XP world where the Customer is involved on a daily basis (remember the 4th principle of the Agile Manifesto) and is seeing and accepting completed work as near to when it's completed as possible.  When that's the case, the demo is no longer for the benefit of the Product Owner but instead for the stakeholders outside of the team.

When I train & coach teams, I actively encourage them to have the Customer/PO perform the demo.  This, in turn, reinforces the expectation that the Customer is actively engaged with the team and that the work is truly done to the Customer's satisfaction, such that they can present it to the stakeholders.  Since the Customer either is from or represents the business, the completed work is presented from that business perspective rather than from the team's technical perspective.  Being able to speak the same language as the stakeholders is a key aspect of this.

Of course, there is no absolute rule here.  I've seen many demos where the Customer sets the stage and the team members do the hands-on part.  For example, early in a system's development a development tool may be used to show that some background process worked because the stories behind that functionality had higher priority than the stories that would allow the system to show the results of the process.  The Customer may not know how to use the tool, but a team member does!

I've also seen demos performed by the team where the Customer is seeing the work for the first time at the demo, and they have gone just fine.  I've seen Customers who are more technical-facing than business-facing micromanage the work that the team is doing.  I've seen Customers who are completely disengaged and don't even show up to the demos, complaining many iterations later that the system doesn't do what they want.

In general, though, if your Customer or Product Owner is business-facing then I have found that work flows much more smoothly when that Customer is actively engaged with the team and is the person performing the demos.

I'm very interested in your thoughts... please comment!

16 September 2011

Technical Excellence in Scrum

Jeff Sutherland recently posted a message on the Scrum Development Yahoo Group regarding Scrum and Technical Debt.  Jeff mentioned why Scrum eschewed technical practices, specifically:
In 1995, Kent Beck asked me for everything on Scrum.  In a famous email he said he wanted to use everything he could and not reinvent the wheel.  The first Scrum team was doing all the XP practices in some form.  However, in 1995 when Ken Schwaber and I started rolling out Scrum to the industry, Ken though [sic] we should focus on the framework as it would lead to more rapid adoption and teams should use the impediment list to bring in the engineering practices as needed.
I applaud Jeff for saying this, and one certainly can't argue with Ken's assertion that avoiding technical practices would lead to more rapid adoption.  I would argue, though, that very few teams use the impediment list as a means of improving their technical practices on an as-needed basis.  In my own coaching experience, that number is painfully close to 0.

Jeff goes on to call out the coaching and training community:
At Snowbird this year, Agile leaders from all over the world convened to do a retrospective on 10 Years Agile. The prime directive that was unanimously agree upon by all present was that in the next tens years Agile leaders must Demand Technical Excellence. Failure to do that means you are not an Agile leader. We are sloppy in our coaching and training. If stuff is not done and/or has bugs at the end of the sprint, the team is not showing technical excellence and is not agile. We need to be clear about that.
And:
One of the reasons we have so much technical debt is Agile leaders are not coaching and training well enough.
I suppose that from a results-based view, Jeff is right.  We obviously aren't coaching and training well enough.  However, I don't feel that's universally the case, and I've certainly pushed for technical excellence since I first started telling others about XP in 2001.

I responded to Jeff, venting some of my own frustration in the process.  Here is that response:

Hello Jeff,

I agree 100% with everything you say on this topic except, "...Agile leaders must Demand Technical Excellence. Failure to do that means you are not an Agile leader. We are sloppy in our coaching and training."

If I had a dollar for all of the times I have pointed out practices that were contributing to technical debt, I could take a very nice vacation.  If you add the number of times I've shown practices (through demos, pairing, etc.) that help reduce technical debt, I could extend that vacation significantly.  If you add the number of times I have received the "we don't have time" excuse, despite my pleas to consider all of the issues leading to debt as impediments, I'd be getting into early retirement territory.

I learned "Agile" from XP back in 2000.  Prior to that I did what I could to achieve technical excellence.  The XP technical practices, most importantly TDD, improved my level of excellence considerably, and I've been teaching those practices ever since.  I'm constantly astounded by the number of supposedly competent software developers who don't believe that writing any automated tests for their code is worthwhile, let alone using a practice like TDD.  I have repeatedly shown how TDD results in simpler, more robust code, and yet there is still a huge amount of skepticism about the practice.  Don't get me started on Pair Programming.

I have beaten myself up many times thinking that I'm not coaching well since the people with whom I'm working aren't using these practices, and don't see the value despite their velocity eroding after a half-dozen sprints or so.  I've beaten myself up when these teams don't think it's a big deal when backlog items aren't complete at the end of a sprint, despite my advice to simply finish what can be finished to the Definition of Done and use the Retrospective to figure out why some items weren't completed.  I've beaten myself up when teams don't listen to my advice about ensuring that all of the team members are on the team 100% of their time, and people get pulled away and not all items are completed in a sprint.

Frankly, I'm tired of beating myself up when people don't want to listen, but that's a different issue.

Over the past few weeks I've done a lot of soul searching with respect to my work as a coach, discussing it with other people ranging from my wife to the original XP Coach himself.  I'm by no means perfect, but my conscience is clear when I say that I have NOT been sloppy in my coaching and training.  I have busted my ass trying to get people to adopt the types of practices that will lead them to technical excellence.

I suppose you can lead a person to knowledge, but you can't make him think.