23 January 2006

Personal Responsibility in IT

A mention on the Ottawa XP mailing list of the division of workload between a commercial pilot and co-pilot reminded me of a post I've been meaning to make for a while about personal responsibility in our industry.

I'm currently in the process of getting my Private Pilot Licence. I flew my first solo back in August, and to do so I had to pass a written exam known as the PSTAR. It's similar in concept to the test you have to pass to get your Learner's Permit for driving, although you need 90% to pass. The good thing, though, is that Transport Canada provides all 200 of the possible exam questions in advance; 50 of them are used on the test.

When I was studying, I noticed that fully 10% of the questions were along the lines of, "If a controller clears you for XYZ and you do not feel that it's safe, do you...", or "In a control zone, whose responsibility is it to...". In every case, the answer came down to the pilot in command being responsible for the aircraft, even if a controller had issued an instruction or clearance.

In October, I went over to the airport on a nice Saturday morning to practice takeoffs and landings, known as circuits. I did my pre-flight checks of the plane, with the flying school's owner helping me move the plane out, ready to taxi. Everything looked fine, and I taxied out and did 7 circuits. Oddly enough, I made 7 of my best landings ever, and was quite pleased with myself! After my last landing, I taxied off the runway and back to the ramp at the school.

As I pulled off the taxiway and onto the ramp, I heard and felt a thump below me and felt the plane dip a bit to the left. I thought I had hit a rut in the pavement, but realized very quickly that I had blown a tire! I killed the engine immediately and got out to have a look. The owner was on his way over to see what happened, and we looked at the tire. There was an enormous bald spot where the tire had blown; it had been worn right down past the rubber to the fabric lining. I missed it on my preflight, and the owner didn't see it either. We figured that it had been facing down when I had a look at the tires. Regardless, it was solely my responsibility to have ensured that the tires were safe for use.

If that tire had blown on the takeoff run or on landing, it would have been quite serious, with the plane and its sole occupant (that would be me) probably departing the runway to the side rather than up! In any case, it would have been a completely avoidable accident if I had just taken a better look at the state of the tires while we were pushing the plane out. Since that time, I have questioned another bald tire (though not as bad), and returned from a practice flight when an engine RPM gauge was acting wonky (that's a technical term). It turned out that the gauge was just affected by the cold, but it also could have indicated an oil leak.

For me, the analogy is that we as programmers need to have the same level of personal responsibility. If we see something that isn't right, we should question it publicly or fix it. If we see code that doesn't have tests, we should write them. If we see opportunities for refactoring, we should take them. If we see issues with our team, we should make them known and try to have them fixed.
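
To make "write them" concrete, here's a minimal sketch of what picking up an untested routine and adding a test can look like, in JUnit 3 style. The validator, its rule, and the test names are all hypothetical, invented for this example rather than taken from any project mentioned here:

    import junit.framework.TestCase;

    // Hypothetical example: a small routine that previously had no tests.
    class PostalCodeValidator {
        // Canadian postal codes follow the pattern A1A 1A1.
        static boolean isValid(String code) {
            return code != null && code.matches("[A-Za-z]\\d[A-Za-z] ?\\d[A-Za-z]\\d");
        }
    }

    // The test we should write the moment we find ourselves in this code.
    public class PostalCodeValidatorTest extends TestCase {
        public void testAcceptsWellFormedCode() {
            assertTrue(PostalCodeValidator.isValid("K1A 0B1"));
        }

        public void testRejectsMalformedInput() {
            assertFalse(PostalCodeValidator.isValid("12345"));
            assertFalse(PostalCodeValidator.isValid(null));
        }
    }

The point isn't the validator; it's that adding the test while you're already in the code costs minutes, while the latent defect it catches can cost far more.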

If I didn't question the wonky RPM gauge and there was indeed an oil leak, I or the next person to fly the aircraft could have had an engine failure. Depending on when it happened, it could have been catastrophic and led to the loss of life. While most IT projects don't have the same life-critical implications, we still need to engender a sense of personal responsibility in our work.

20 January 2006

The Disengaged Customer - Introduction

What happens to a successful agile project when the Customer is no longer engaged? This is part 1 of a series of entries about how important the Customer role, and specifically an engaged Customer, is to agile development.

In the next couple of weeks, I'm moving on from a client where I've been working for a little over 2 years now. When I joined this team in October 2003, it was what I would consider to be a successful XP/Agile project, incrementally delivering value to its Customer. One release had been made to production, and I joined just as a second release was being made. I thought it was a good sign that by 10:30 in the morning on my first day on the project I was writing production code while pairing with one of the other developers.

I did note that the system's Customer was only present twice a week for meetings with the team. We had a business analyst who met with the Customer more often, but she wasn't really in a position to make yea or nay decisions about the system. However, the Customer, when she was with us, was quite engaged and provided good feedback. She had bought into agile development after a couple of iterations, and seemed to enjoy working this way.

This continued for a few months until this particular Customer was moved to another project, and was replaced. The new Customer was quite laid back, but had a tendency to waffle somewhat on issues. It wasn't unusual to make a suggestion about a story and receive, "Sure, that's fine with me," as the answer. A couple of days later, she would change her mind. A couple of weeks later, she'd change it again. This was a little aggravating, but we were able to accommodate it.

The waffling started while we were developing a web component for our system. About a month into this, we were directed by the Gold Owner to work on a new priority project. No problem, we said! This project came with a new Customer, who was as engaged as the first. Her answers were either definitive, or she would go back and get a definitive answer very quickly. We also decided to crank up our agility somewhat, changing to one-week iterations from two, and using point-based estimating rather than actual time. We cranked out that new part of the system very quickly, with few problems, and the team really clicked. I was quite happy with our progress, and figured that we were approaching XMM Level 5!
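
(A quick aside for anyone unfamiliar with point-based estimating, with made-up numbers as an illustration: instead of estimating a story as "two days", the team rates it at, say, 3 points relative to other stories. If the team completes 18 points of stories in a one-week iteration, the next iteration is planned to roughly 18 points. Velocity is measured rather than promised, so the plan self-corrects as the point-to-time ratio drifts.)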

Continued in The Disengaged Customer - A Funny Thing Happened.

14 January 2006

The Last Straw

I had always tried to be a "good" programmer - analysing, designing, coding and testing in separate, discrete steps. However, I always found myself subverting the process and getting something tangible in front of the Customer in order to obtain their feedback, and I actually got in quite a bit of trouble for doing it once. One project, though, turned out to be the Last Straw for my attempts to follow the conventional wisdom of the time.

A thread a couple of months ago in the XP Yahoo Group reminded me of how and why I came to be a part of the Agile development movement.

A post by Brian Nehl referred to two articles about the development of the Canadian Firearms Registry.
From June 1998 to December 2000, I worked on that project. The article in Baseline magazine is, in my opinion, quite balanced and accurate. Dwayne King, who was one of my colleagues, describes quite well the environment in which we worked. Some things that weren't said:
  • The program was backed at the very highest level of the government and would be implemented come hell or high water.

  • The estimated total cost of the system was based on the development cost of the system that was in place at the RCMP at the time. However, that system only captured the licensing information for people using Restricted firearms, which represented only a fraction of the records that were to be contained in the new system.

  • The application was developed using PowerBuilder 5.0 and Oracle 7.3, which were somewhat long in the tooth in 1998. However, this decision was made in order to minimize the technical risk, and was something with which I didn't have a problem.
The methodology used for the project was straight waterfall, although we were using OO techniques for the design and development (I believe they may have moved to RUP shortly after I left in December 2000). The requirements for the system were drawn from the legislation and the associated set of regulations. The problem with this approach was that quite often we needed legal opinions from the Dept. of Justice over what was meant by certain parts of the legislation. Just as often, the same question could elicit multiple different answers from different people, or would result in hours of discussion over a single, minute point.

Since there were so many major stakeholders for the system (14, I believe), there was a group that handled all of the requirements definition and management. They were essentially our Customer, but they acted as an insulating layer between the development team and the real Customers. Their 'product' was a big binder called the BPE - Business Process Engineering, and it contained page after page of decision matrices, some of which were even up to date. In retrospect, those matrices could have been great FIT tables, although they often weren't consistent.
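
As an aside, here's a hedged sketch of how one of those decision matrices could have become an executable FIT table. Everything in it - the fixture name, the fields, and the eligibility rule - is hypothetical, invented purely for illustration:

    import fit.ColumnFixture;

    // Hypothetical FIT column fixture. The matching table in the test
    // document would look something like:
    //
    //   LicenceEligibilityFixture
    //   age | safetyCoursePassed | eligible()
    //   17  | true               | false
    //   18  | true               | true
    //   18  | false              | false
    //
    // FIT sets the public fields from each row's input cells, then
    // compares the return value of eligible() against the last column.
    public class LicenceEligibilityFixture extends ColumnFixture {
        public int age;
        public boolean safetyCoursePassed;

        public boolean eligible() {
            // In a real system this would delegate to the production
            // rules rather than restating them here.
            return age >= 18 && safetyCoursePassed;
        }
    }

Run that way, every row of the matrix becomes a test that passes or fails against the real code, and the inconsistencies would have surfaced immediately instead of lurking in a binder.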

From a contracting perspective, it had been a real dogfight among the big SIs (systems integrators) to land the development contract. It was eventually won by EDS, with the testing contract being awarded to Systemhouse. The Baseline article suggests that EDS and Systemhouse worked together, but nothing could be further from the truth. There was a lot of bad blood between the companies over the bidding process, and there was a very adversarial relationship between the teams at the management level. It was downright petty at times, although the people in the trenches (i.e. developers like myself and the individual testers) got along fine.

When I joined the project, the development team consisted of approximately 25 PowerBuilder and Oracle PL/SQL developers. This eventually grew to about 35-40 people, although the exact number escapes me. Almost all of the developers were contractors (EDS subcontracted the development to the company through which I was contracting at the time), with only a few employees in the mix. There was a lot of talent on the team, with decades of combined experience at the analyst/designer level as well as at the developer level. It was, essentially, an all-star team, and the contract paid well. As Dwayne King suggests, the project was well within the technical capabilities of the team. With the benefit of Agile Hindsight, the team could have been considerably smaller, and still produced better results using an agile approach.

When I joined, the team was ramping up for an initial release to production in December of 1998. They had just completed what they called the 'Alpha' release, a proof of concept intended to verify that the system would be workable and to expose any technical risks. There were some substantial performance problems in the application, but these simply required some code optimizations, which were eventually performed. I was also surprised to find out that the database schema had been determined before any other development had even started.

The development team itself was split into groups that focused on particular areas of functionality, each led by an architect/designer. Despite the fact that we were all located in the same place, there was a surprisingly low level of communication between these groups. This led to some significant issues:
  • Different approaches were taken to developing different parts of the same application in terms of class hierarchy, separation of logic between application layers, etc.;

  • Different coding standards were applied between the groups;

  • Different groups at times used the same database columns for different purposes;

  • Code integration was anything but continuous, occurring usually just before a release to the test group. This happened about every 6 months.
Needless to say, integration was a mess. Individual pieces would compile on developers' workstations, but the integrated application rarely did. The last week before shipping the system to the test group was usually spent just getting the bloody thing to build, let alone performing our own testing.

Once those problems had been ironed out (or just hidden), the system was tossed over the wall to the testing group. They used the (frequently out of date) BPE document as the basis for their tests, and many inconsistencies quickly became evident. Sometimes the code was wrong, sometimes the requirements were wrong. In the end, it always became a battle.

So, here's a project that had the following issues:
  • The powers that be believed that they could "nail down" the requirements based on legislation and regulations that were still in a state of flux;

  • A group insulated the development team from anything resembling the Customer;

  • The initial estimate of the project was based on "a similar project" that was only similar in that it had information about firearms and ran on a computer;

  • The estimate was not made by anyone from the team that would have to implement the application;

  • The development team was split into multiple groups, each one working almost in isolation;

  • Code integration was infrequent;

  • There was no automated testing of any sort, and no formal unit tests;

  • Acceptance testing was manual, and performed after the fact by a group completely separate from either development or the Customer;

  • Iterative development was not practiced;

  • Releases were typically once every 8 months to a year.
And this system caught the eye of the Auditor General? What a shock.

For a while, I just figured that this was what life was like on large projects with large teams. I did get a reprieve in early 2000 when I was tasked to build a small satellite application that had very focused requirements and an accessible Customer. Once that was finished, though, I had had enough. I actively sought out a new contract, and in December 2000 one came my way.

A funny thing happened during the interview for that contract. About halfway through, I realized that it was for a Java developer position! At that point, I had about 6 months of playing around with Java, but I certainly wasn't in any position to be marketed as a Java consultant. Fortunately for me, this was still during the tech boom here in Ottawa, and companies such as Nortel and Alcatel would hire anyone who had Java on their CV and could fog a mirror. That left a significant void in the contracting world for Java developers, and the manager who interviewed me explained that my O-O experience was what they were really after and I could learn Java while I was working! Uh... OK!

I started working with Jim Leask, a consultant from Sybase. Jim had a couple of years of Java under his belt then, which was pretty impressive considering that Java itself was barely out of diapers. Our mission was to create a framework that would be used by "an army of developers" to build a big, honking case management system for this government department (which is another story unto itself!). The goal was to simplify things such as data access, GUI building, etc. so that they didn't need expert developers to build this thing.

We started out by handling one of the most critical aspects of any system built for the Canadian federal government - bilingualism. We began doing some modeling in Rose, but after a day or so we decided to validate our designs in code. Since I was a Java newbie and Jim was an expert, we decided that I'd do the typing and he would look over my shoulder to guide me as I figured out Java. We threw some classes together and realized that something in our design wasn't quite right, so we switched to his machine and updated the design based on what the code told us. While there, we added a bit more to the design, then switched to the other machine and I again hammered away on the keyboard. This back and forth activity continued for a couple of hours, at which point Jim uttered the immortal words:

"This reminds me of that Extreme Programming thing I heard about."

Once the images of Mountain Dew-swilling snowboarders with laptops cleared away, I did a quick Internet search (I can't even remember what search engine, though I know it wasn't Google). I believe the first site we visited was either Ron Jeffries' XProgramming.com or the C2 Wiki. The next day we hit the bookstore - Jim bought XP Explained, and I bought XP Installed. The rest is history.
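
For the record, I don't remember exactly what our bilingualism spike looked like, but a minimal sketch of that kind of lookup, built on the standard java.util.ResourceBundle mechanism, would be something like the following (the class name, key, and bundle files are hypothetical):

    import java.util.Locale;
    import java.util.ResourceBundle;

    // Hypothetical sketch of a bilingual message lookup. Assumes two
    // properties files on the classpath:
    //   Messages_en.properties:  greeting=Hello
    //   Messages_fr.properties:  greeting=Bonjour
    public class Messages {
        public static String get(String key, Locale locale) {
            return ResourceBundle.getBundle("Messages", locale).getString(key);
        }

        public static void main(String[] args) {
            System.out.println(get("greeting", Locale.ENGLISH));       // Hello
            System.out.println(get("greeting", Locale.CANADA_FRENCH)); // Bonjour
        }
    }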

So, I find it interesting that my disgust over the classic waterfall practices and the inherent dysfunction of a team using them led me to the Agile Promised Land. I will never go back.