We're good... Right?
How do you know when you are performing at your best? What is an appropriate measure of performance for software development?
With the waterfall mindset, a project is performing at peak efficiency when it comes in on time and under budget. However, that really says little about the efficiency of the software development process. Two identical projects may be a great success in one company and a dismal failure in another simply because they started with different timelines and budgets.
I recently read a book by David Anderson which gives an interesting insight into this problem: the measurement should be the throughput of value through the software development process. It's an interesting concept that I've been thinking about lately. At first I was very skeptical of the assembly-line metaphor for software development, since software development is, at its heart, still a very innovative activity. However, as the author lays it out, I think several parallels can be made.
The theme is that the software development process should be examined using many of the 'Just-In-Time' manufacturing concepts from operations management and operations research, namely the 'Theory of Constraints' (TOC) pioneered by Eli Goldratt. The idea is that the output of your system is governed by its bottleneck. All other components of the system need to operate at the same pace as the constraint, or you are just generating waste.
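To make the bottleneck idea concrete, here's a toy sketch in Python. The stage names and per-iteration capacities are entirely made up by me; the point is just that the system delivers at the pace of its slowest stage, and anything the other stages produce beyond that pace piles up as unfinished work.

```python
# Toy pipeline: per-iteration capacity of each stage, in features.
# These numbers are invented for illustration only.
stages = {"analysis": 10, "development": 8, "testing": 4, "deployment": 9}

# The system can only deliver as fast as its slowest stage (the constraint).
bottleneck = min(stages, key=stages.get)
throughput = stages[bottleneck]
print(f"constraint: {bottleneck}, delivered per iteration: {throughput}")

# Everything the other stages produce beyond that pace just accumulates
# as unfinished inventory in front of the constraint.
excess = {stage: cap - throughput for stage, cap in stages.items() if cap > throughput}
print(f"work piling up per iteration: {excess}")
```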
While the TOC is a valid observation about software development, and certainly should be considered when managing your process, I thought the measurement of productivity was the most interesting part. In my experience, managers look at a timeline-oriented view of how effective their employees are. On some projects, I've been able to meet expectations easily simply because the schedules were padded. On others, I've missed deadlines that were completely unrealistic and had my performance judged poorly.
Instead of relying on assumptions made at the beginning of the project with poor information, Anderson lays out the idea that throughput of value to the customer is the best measure of performance. Every feature that the customer wants in the system is kept on the project backlog, and after an iteration the completed features are removed from the backlog and recorded. The number of features completed within an iteration represents the value created in that iteration. By keeping track of the features in 'inventory' (the features on the backlog) and the features in 'finished goods' (in production), a manager can track the performance of a software development system.
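Here's a rough sketch of what that bookkeeping might look like in Python. The class and feature names are just my own illustration of the inventory/finished-goods idea, not anything taken from the book:

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str

class DevelopmentSystem:
    """Track features as 'inventory' (backlog) and 'finished goods' (in production)."""

    def __init__(self, backlog):
        self.inventory = list(backlog)    # features waiting to be built
        self.finished = []                # features delivered to production
        self.throughput_log = []          # features completed in each iteration

    def end_iteration(self, completed):
        """Record an iteration: move completed features out of inventory."""
        for feature in completed:
            self.inventory.remove(feature)
            self.finished.append(feature)
        self.throughput_log.append(len(completed))
        return self.throughput_log[-1]

# Example: two iterations against a five-feature backlog (hypothetical features).
backlog = [Feature(n) for n in ("login", "search", "export", "billing", "reports")]
system = DevelopmentSystem(backlog)
system.end_iteration(backlog[:2])   # throughput = 2
system.end_iteration(backlog[2:3])  # throughput = 1
print(system.throughput_log)        # [2, 1]
```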
If the quality of the output is low, the team will spend more time doing rework that does not add value, so there is an incentive to ship high-quality code to the users.
I would recommend reading the book to get a better description of throughput as it applies to a software development team.
I would suggest a few additions to this concept of throughput.
1) All features are not created equal: Have the customers assign a value rating (e.g., 1-5) to each feature. This will help the team prioritize the backlog and work on the highest-value items first. The throughput of the system will then take into account the value generated by each feature.
2) Measure the complexity of a feature: Either break a feature into several sub-features of equivalent difficulty, or rate each feature's complexity (e.g., 1-5). This helps a manager gauge the level of effort put out during an iteration.
3) Value in uncertainty: If a feature carries a lot of uncertainty in either its functional requirements or its technology risk, value is derived from gaining more information and reducing that uncertainty.
4) Option value: Some features depend on the completion of others. The intermediate feature may not present a lot of value in itself, other than that you now have the ability to execute another feature.
So the overall performance of a team is the value created over a period of time, considering the effort the team put into the feature set, any uncertainty reduced, and any new options opened up for future releases.
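As a rough sketch of how those additions could be folded into a single number, here's one way it might look. The weights and field names are entirely my own invention, not Anderson's formula; the point is just that value delivered can be compared against effort spent per iteration:

```python
from dataclasses import dataclass

@dataclass
class RatedFeature:
    name: str
    value: int          # customer value rating, 1-5
    complexity: int     # effort/complexity rating, 1-5
    uncertainty: int    # 0-5: how much unknown the feature retires
    enables: int = 0    # number of follow-on features this one unlocks

def iteration_value(completed, uncertainty_weight=0.5, option_weight=0.5):
    """Sum customer value delivered this iteration, with (hypothetical)
    bonuses for uncertainty reduced and options opened up."""
    total = 0.0
    effort = 0
    for f in completed:
        total += f.value
        total += uncertainty_weight * f.uncertainty
        total += option_weight * f.enables
        effort += f.complexity
    return total, effort   # value delivered vs. effort expended

# Hypothetical iteration: one high-value feature, one risky enabler.
done = [
    RatedFeature("search", value=5, complexity=3, uncertainty=1),
    RatedFeature("import API", value=2, complexity=4, uncertainty=3, enables=2),
]
value, effort = iteration_value(done)
print(f"value delivered: {value}, effort spent: {effort}")
```

Tracked iteration over iteration, a ratio like this says more about how the team is actually performing than whether it happened to beat a padded (or unrealistic) schedule.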