24 March 2014

Gourmet Crow, or Wearing a Different Hat

For readers for whom English isn't their first language: we use the phrase "eating crow" to describe a situation in which you must admit you were wrong after taking a rather strong position on something. While this isn't exactly that case, hence the second title, it does illustrate a lesson in perspective.

First Hat - "Sherpa"

While I was working at Shopify, one area on which I focused was product quality. The good news was that there were (and still are) tons and tons of automated tests for the system. What I felt needed improvement, though, was a more widespread application of testing practices to ensure that the system worked as intended beyond a superficial run on a developer's laptop.

This distinction is likely why people like Michael Bolton prefer the term "check" over "test", because testing is a very different activity. Using that terminology, my view at the time was that Shopify had plenty of checks but not enough testing.

Like any large system, there were defects and my position was that the group needed to improve their testing skills in order to prevent more defects from making it to production.

Second Hat - "Merchant"

Then something rather funny happened. After I left Shopify and rejoined the consulting world, I decided to use my own Shopify store as the platform for my services and some fun Agile swag ideas in The Agile Store. I was no longer viewing the system from the perspective of an insider who knew about the defects, but rather as a customer working with the product.

In two weeks of setting up and tweaking my store, I encountered a single "glitch" that I barely noticed and easily worked around. Even my critical eye was surprised at how smoothly it all worked. That isn't to say the software is perfect - I know that entering boundary or out-of-range values in some places can cause errors - but in everyday use, from the perspective of a consumer of the software, it works not just fine but quite well indeed.

It was a good lesson on the need to balance striving for perfection from a development perspective vs. striving for the best customer experience from a business perspective.

Of course, you have to determine the optimum balance for your particular domain. For example, since it's tax season, consider the tax return preparation software you use. Imagine it has a defect that only occurs under very specific circumstances of income, family size, and so on, and it leads you to believe you're going to receive a generous refund. You happily submit your return and await that nice fat cheque! Except the software was wrong and you actually owe money, so you could be charged interest and possibly penalties. Not only that, but the information submitted by the faulty software tweaked something that triggered an audit by your tax authority.

That would likely be considered a suboptimal customer experience, to say the least.

There are domains and circumstances where we can put up with plenty of defects and just keep going. If a game crashes, we likely just start over. If we really like the game, our tolerance of defects is much greater.

And there's always the good old standby of the air traffic control system. Not only does the customer experience have to be good, i.e. the air traffic controllers can easily understand what they see and make appropriate inputs, but the defect level of the software has to be extremely low.

So, have the discussion about the level of tolerance of defects within your domain. At the same time, strive for the best possible software. Even after my experience as a consumer, I would still advocate for better testing. The trick is finding the sweet spot between perfect software and what allows the business to grow sustainably. I still maintain that we can drive software defects to near-zero levels at a reasonable cost, but perhaps my lesson in this case has tempered the passion with which I convey that message.

Oh, and does anyone have any wine pairing suggestions for crow?

20 March 2014

Mandated Agile - A Contrarian View

There was an interesting exchange recently on Twitter about Agile adoptions/transformations that are mandated. Dan Mezick asserted that:
Between 75% and 85% of all #agile adoptions fail to last. 99% of these adoptions are implemented as mandates. Any connection here?
I responded, asking for Dan's source for those stats. His answer was that it was "pro coaches" like myself. What ensued was a long conversation (such as one can have on Twitter) including people such as Mike Cottmeyer, George Dinwiddie, Glenn Waters and others.

Dan's position is that mandated Agile doesn't work, and his Open Space-flavoured version called Open Agile Adoption is much more inclusive and grassroots-driven.

Just to be clear, I have no arguments against Open Agile Adoption and I'm a big fan of Open Space and how it can be used to ensure that many more people are engaged in the work process. There are a couple of things, though, that bother me about Dan's statements.

First, even if his statistics are accurate, I want to see that his sources are somewhat more rigorous than anecdotes from other coaches. Laurent Bossavit has made a side career for himself of challenging statements such as Dan's, with much of what he's found catalogued in his book The Leprechauns of Software Engineering. I'm not suggesting that Dan's numbers are wrong, just that if he's using them to market his own product or service then it behooves him to "show his work", as a multitude of math teachers and profs told me.

My second issue is with the implication that mandated Agile is wrong. Of course it would be better if such a change began and grew from the grassroots rather than being imposed by management. Except... I was among the many who had great success with XP in the early 2000s on a single team, but precious little if any success trying to grow it beyond that point. Forget about management - other teams simply weren't interested, regardless of how successful we were.

There's also another funny aspect to this. If I'm working for a private company and the owner wants something done a certain way, it's absolutely her prerogative to do so! If the head honcho wants Agile and says, "Make it so!", then you're faced with a choice: you either work with the owner to make it so, or you can choose to leave.

While that view doesn't fit the mold of how we believe organizations should be run, it is how 99% of them are. OK, so I just grabbed that number out of the air. :) My experience has invariably been that this is how organizations are managed, for better or for worse.

We hear about the Semcos and the Gore & Associates of the world because they're so different, not because many other organizations are like them. Of course we should take lessons from those companies and apply them! But we also have to be wary of cargo-culting, such as the North American automakers' imitation of Toyota's manufacturing model.

In the end, though, most businesses are not democracies. Good ones are benevolent dictatorships, and the leaders in those companies are much more inclusive of others in the decision-making process. The people in those organizations feel valued and are motivated to do great things.

But even in those companies, every so often decisions are made by the top leadership without consulting the masses. Those decisions affect everyone, and are imposed via a mandate. If the people trust that the organization's leadership is making these decisions for solid business reasons, then there really isn't a problem. If the leadership communicates those reasons and the vision behind the change, then the people on whom this mandate has been imposed are much more likely to support it.

Not all mandates are bad, and some are necessary. Creating such a false dichotomy serves no one in the long term.

Since I've now given Dan's Open Agile Adoption some free advertising, I would like to state that my own position is to help groups determine what is most effective in their context. The definition of effective will change from team to team, even within the same organization. It will also change from domain to domain. I have accumulated a lot of great principles and practices in 25+ years, as well as the wisdom to know that "one size fits all" means "it doesn't fit anyone properly". If you think that's interesting, come on over to DaveRooney.ca to see how I can help.


19 March 2014

Solve the Right Problem, Solve the Problem Right

Over twenty years ago I learned a valuable lesson about solving the right problem for the people who used the software that I was building.

I was working on a training management system, specifically on the reports that were needed by the people who handled training for a relatively large organization. There were a number of 'canned' reports, meaning that the format was fixed and the query options were limited to a small set of options. One of the reporting requirements was for an ad hoc report generator that would allow the people in the training department to create and save their own reports. This report engine was intended to have the flexibility that the canned reports didn't.

At the time, I looked at several options including off the shelf report packages (I think an early version of Crystal Reports). However, my software developer "build a better mousetrap" instincts took over and I decided to write the report generator myself.

I used an existing report file format as the starting point after finding some documentation about its binary format. I then extended it to contain some extra information that I needed for my report engine. The big chunk of the work was in building a quasi-WYSIWYG report designer that would give the people the ability to see how fields were being positioned, headers, footers, groups, etc. It was anything but a trivial task and took me about 3 to 4 weeks to have it working in a reasonable manner.

But no one used it.

I gave some demos. I sat with people and helped them create a report. They still didn't use it. From a functional perspective, my report generator was as good as anything you could get off the shelf and as easy to use... at least from my perspective as a developer. From the perspective of the consumers of the system, they just didn't have the time to learn how to use the tool well enough. So, they simply didn't use it.

Then a funny thing happened. I received an urgent request for a report that had to join data from several tables and aggregate results. My report engine wasn't built to handle such a report, so I had to quickly throw together a program that could do the work. Once I had the basic report in place, I had the person who requested it sit with me to review what I had done. There were a couple of tweaks to make, but it was mostly OK. I asked him whether this was a one-off situation or whether the report would be needed again in the future. The answer, of course, was the latter - this was a new requirement from upper management.

The minor problem, though, was that I was working on another system for another group in the same organization and couldn't take the time to add the new report into the training management system as one of the canned reports. Also, by that point policies around releasing desktop systems had changed and each system had to go through a test process to ensure that it would play well with other systems used by that organization.

So, I quickly created a UI for the person to be able to enter some query parameters, rolled it all together into an app and handed it to the person on a disk. He was very happy, to say the least, and I may have spent half a day on the work.

A month or two later, he came by and sheepishly said that he had been given another report request from upper management. Again, it was just different enough that the ad hoc report generator couldn't handle it. Being a lazy developer, I simply copied the code from the previous report, replaced the report generation code with what was needed for the new one, and changed the UI to handle the different parameters. Pack it up, ship it off, and you have another happy customer!

Then, the third request arrived. This time, I copied the code but created a 'skeleton' of the app that would generate the report. The UI was blank except for buttons, as was the method that created the report. I now had a template app that just needed the code for the report and UI to allow the user to change the report parameters.
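To make that pattern concrete, here's a minimal sketch of what such a template app might look like. It's in Python rather than the original environment, and the table names, columns, and training-department query are all invented for illustration; the point is the shape, where only the query and its parameters change from one report to the next:

```python
import sqlite3

# Hypothetical sketch of the 'skeleton' report app. Only run_report()
# and its parameters change for each new request; the rest is the
# reusable template.

def run_report(db_path, params):
    # The per-report part: a join across tables plus an aggregate,
    # driven by the parameters the user entered in the UI.
    query = """
        SELECT c.course_name, COUNT(e.student_id) AS enrolled
        FROM courses c
        JOIN enrolments e ON e.course_id = c.course_id
        WHERE e.year = ?
        GROUP BY c.course_name
        ORDER BY enrolled DESC
    """
    with sqlite3.connect(db_path) as conn:
        return conn.execute(query, (params["year"],)).fetchall()

def format_report(rows):
    # The shared boilerplate: render the rows as a plain-text report.
    return "\n".join(f"{name:<30} {count:>5}" for name, count in rows)
```

Each new request then becomes a half-day job: copy the skeleton, swap in the new query and parameter UI, and ship it.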

Again, I churned out a report and my customer was a happy camper. Another half day at the most.

The fourth request arrived. After half a day, there was the report's app on a disk and I believe at that point I also provided an installer to simplify that process.

I worked in that organization for another 4 years, and I don't know how many more of those reports I generated. My customer was very happy the entire time. There were other people for whom I built these quick reports as well. In those 4 years, not a single person other than me used the ad hoc report generator that I spent 3-4 weeks of my life building.

The moral of this story is that I wasn't solving the right problem with the ad hoc report generator. Yes, the customer had requested the ability to create ad hoc reports. I translated that requirement into a need to build something into the system rather than the simpler approach of writing a small app for each need. The 3-4 weeks I spent building the report generator was quite likely more than the time spent building the individual report apps.

I also didn't take the time to let anyone else try to build a report using the generator I had written. I likely would have seen much sooner that someone who didn't know the tool inside and out would struggle to create a report from scratch, and that another approach was needed.

Essentially, if I had thought more critically about solving the problem of ad hoc reporting the system could have shipped 3 to 4 weeks earlier than it did.

And that, I suppose, was the cost to learn what problem I actually needed to solve and how to solve it "right". It's a lesson that has stuck with me, and it has been the rope that allowed me to climb out of a number of rat holes since, by pausing and asking:

Am I solving the right problem here? Is this the right solution to the problem?

9 March 2014

Interesting - Packing List for Your Agile Journey Virtual Training

My good friend Gil Broza, author of The Human Side of Agile, pointed me towards an upcoming virtual training event called the Packing List for Your Agile Journey. It's a 5-day event with a tremendous "cast of characters", including Johanna Rothman, Arlo Belshee, Ted Young and Paul Carvalho among others.

Gil's approach is to interview each of these 10 industry leaders, having them discuss their own experiences - the ups and downs of moving to Agile methods across the spectrum of organization sizes and business domains. Some of the topics to be covered include:
  • Organizational Support
  • Team Collaboration
  • Adapting Agile as you Learn
  • Whole Product Thinking
  • Making Quality a Mind-set
  • and many more...
I've known Gil for a long time now, and my advice is really simple - if he's involved, you'll want to hear what they have to say! You can register here.

6 March 2014

Upcoming Book: Effective Software Delivery - Agility Without the Dogma

I've started working on a new book with the rather lofty goal of cutting through the marketing hype and near religious dogma of the various brands of Agile. My focus is on conveying what is effective in the context of a group of people building software in their particular domain.

Effectiveness is the book's overarching concept. There are a multitude of different ways to deliver software, but in the end effectiveness can be distilled into two key activities:
  • Ship something
  • Reflect on how you shipped it in order to improve
Whether you're a lone app developer working in coffee shops or a multinational corporation building equipment that costs millions of dollars, Ship and Reflect still apply if you're going to be effective.

Of course, the details of how you ship and how you reflect are going to vary from team to team and domain to domain, and that's exactly why people and organizations struggle with the different Agile brands. David Anderson, the originator of the Kanban method, made a very interesting statement on the Kanban Dev Yahoo Group back in 2008:
So while I have heard of agile teams that appear to exhibit high maturity behaviors - objectivity, use of leading indicators, low degrees of variability, and (maybe, just maybe) continuous improvement in a failure tolerant culture, I have not heard of one that existed without the direct leadership of one of the personalities in our community. At this point, it is impossible to take the "David" or "Jeff" or "Israel" factor out of the achievement of the high maturity.
That statement really stuck with me, and even more than five years later I see teams facing similar struggles. That's why, I believe, a focus on effectiveness rather than on following a prescribed process is the better approach. What's effective for that lone app developer likely won't be effective for a 50+ person development team building the avionics for a new airliner. What's effective for a group customizing a CRM package may not be effective for people maintaining a legacy system on a mainframe.

The book is intended to cut through the dogma of individual processes to help reveal practices and approaches that are effective in the context of the reader.

Currently I'm self-publishing the book on Leanpub, and you can subscribe to receive updates as they're produced. If you'd like to review the book, please let me know.

5 March 2014

Video of Effective Software Delivery - Agility Without the Dogma

I'm in the process of writing a book entitled Effective Software Delivery - Agility Without the Dogma. This video is an interview with me describing the history and concepts behind the book.

You can register to be notified of updates on the book's Leanpub page.

Please share if you like the video and concept for the book! I will be releasing updates as they're available, with a target of the end of 2014 for full publication.