Reporting

Note

Coaches, Upper Management

We inspire trust in the team’s decisions.

You’re part of a whole team. Everybody sits together. An informative workspace clearly tracks your progress. All the information you need is at your fingertips. Why do you need reports?

Actually, you don’t need them. The people who aren’t on your team, particularly upper management and stakeholders, do. They have a big investment in you and the project, and they want to know how well it’s working.

Progress reports are exactly that: reports on the progress of the team, such as an iteration demo or a release plan. Although progress reports seem to exist so that stakeholders can monitor and correct the team’s direction, that’s not their purpose. Instead, good progress reports allow stakeholders to trust the team’s decisions.

Management reports are for upper management. They provide high-level information that allows management to analyze trends and set goals. It’s not information you can pick up by casually lingering in an open workspace for an hour or two every month; it includes trends in throughput or defect rates.

What kinds of reports do you need to build trust and satisfy strategic needs? It depends on the stakeholders. Some stakeholders are hands-off and just want to see progress toward a goal; others want to know more details so they can build broader knowledge. You may need to produce a variety of reports for different audiences.

Be careful, though—reports take time and energy away from development, so don’t produce every report you can imagine. Provide just enough reporting to satisfy key stakeholders. The project manager and product manager should combine their knowledge of stakeholders to gauge the proper level of reporting. The best way to know, of course, is to ask.

The range of reports you can produce is as broad as your imagination. The following sections list some that I’ve found particularly useful and, conversely, some that are common but unhelpful. The first set of reports is a normal byproduct of the whole team’s work. The rest are usually the project manager’s responsibility, though they depend on some input from the rest of the team.

XP teams have a pronounced advantage when it comes to reporting progress: they make observable progress every week, which removes the need for guesswork. Furthermore, XP teams create several progress reports as a normal byproduct of their work.

Useful and free? There’s little not to like about these four reports.

The release and iteration planning boards already posted in your workspace provide great detail about progress. (See Figure 8-4, a release planning board, and Figure 8-9, an iteration planning board.) Invite stakeholders to look at them any time they want detailed status information.

For off-site stakeholders, consider using a webcam or regularly posted digital photos to broadcast the plans.

A burn-up chart is an excellent way to get a bird’s-eye view of the project (see Figure 8-7). It shows progress and predicts a completion date. Most teams produce a burn-up chart when they update their release plan.
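
The prediction behind a burn-up chart is simple arithmetic: divide the remaining work by the team’s velocity to get the number of iterations left. Here’s a minimal sketch in Python to make the calculation concrete; the function name, numbers, and dates are all hypothetical.

    from datetime import date, timedelta

    def predict_completion(remaining_story_points, velocity_per_iteration,
                           iteration_length_days, today):
        """Project a completion date by dividing remaining scope by velocity."""
        iterations_left = remaining_story_points / velocity_per_iteration
        return today + timedelta(days=iterations_left * iteration_length_days)

    # Hypothetical example: 60 points remain, and the team finishes
    # 12 points per one-week iteration.
    print(predict_completion(60, 12, 7, date(2024, 3, 4)))  # 2024-04-08

In practice, you might project from both your slowest and fastest recent velocities to show a range of dates rather than a single one.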

If your stakeholders want more information, consider providing one or more of the following reports. Avoid providing them by default; each takes time that you could spend on development instead.

Whereas progress reports demonstrate that the team will meet its goals, management reports demonstrate that the team is working well. As with progress reports, report only what you must.

Software development productivity is notoriously difficult to measure [Fowler 2003]. It sounds simple—productivity is the amount of production over time—but in software, we don’t have an objective way to measure production. What’s the size of a feature?

Instead of trying to measure features, measure the team’s impact on the business. Create an objective measure of value, such as return on investment. You can base it on revenue, cost savings, or some other valuable result.

Coming up with an objective measure of value is the most difficult part of reporting productivity. I can’t provide specific guidance because the metric depends on what’s important to your business. Your product manager and upper management should be able to help create this measure.

Once you have a measure, track its value every iteration. Until the team releases software to production, this number will trend downward, below zero. The team will be incurring costs but not generating value. After a release, the trend should turn upward.
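
To make that bookkeeping concrete, here’s a minimal sketch in Python; the per-iteration cost and the value figures are entirely hypothetical, and you would substitute whatever measure of value you settled on above.

    iteration_cost = 20_000       # hypothetical team cost per iteration
    value_generated = [0, 0, 0, 0, 0, 0, 35_000, 35_000, 40_000]  # first release in iteration 7

    cumulative = 0
    for iteration, value in enumerate(value_generated, start=1):
        cumulative += value - iteration_cost
        print(f"Iteration {iteration}: cumulative net value {cumulative:+,}")
    # The total trends downward until the release, then turns upward.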

The primary complaint I hear about this metric is that it’s partially outside of the team’s control. What if the sales staff doesn’t sell the software? What if the business users aren’t interested in the software?

These are valid concerns, but they ignore the reality of organizational success. For your team to achieve an organizational success, not just a technical success, your software must provide business value. This productivity metric reflects that fact.

To score well on this metric, you should have a team that includes on-site customers. These customers will figure out what customers or users want and show key stakeholders how to sell or use the software. By doing so, they will help turn technically excellent software into truly valuable software.

If the project is under time pressure—and projects usually are—stakeholders may want to know that the team is using its time wisely. Often, when the team mentions its velocity, stakeholders question it. “Why does it take 6 programmers a week to finish 12 days of work? Shouldn’t they finish 30 days of work in that time?”

Although I prefer that stakeholders trust the team to schedule its tasks wisely, that trust takes time to develop. In the beginning, I often produce a report that shows how the programmers are using their time. This report requires that programmers track their time in detail, so I stop producing it as soon as possible, typically after a month or two. To keep the burden low, I ask programmers to write their times on the back of each iteration task card (see Figure 6-5) and hand them in to the project manager for collating into these categories:

Graph the total time in each category as an area chart, with “Developing” on the bottom and “Unaccounted” on top. I mark the bottom three categories green, the fourth yellow, and the top white (see Figure 6-6).
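
If you’d rather generate the chart than draw it by hand, here’s a minimal matplotlib sketch. “Developing” and “Unaccounted” are the categories named above; the middle categories and all of the hours are placeholders for whatever your team actually tracks.

    import matplotlib.pyplot as plt

    iterations = [1, 2, 3, 4]
    hours = {                        # hours per iteration, bottom to top
        "Developing":  [180, 190, 200, 195],
        "Category 2":  [30, 25, 20, 20],   # placeholder name
        "Category 3":  [20, 20, 15, 15],   # placeholder name
        "Category 4":  [10, 10, 10, 10],   # placeholder name
        "Unaccounted": [0, 5, 5, 10],
    }
    colors = ["darkgreen", "green", "lightgreen", "yellow", "white"]

    plt.stackplot(iterations, list(hours.values()), labels=hours.keys(),
                  colors=colors, edgecolor="black")
    plt.xlabel("Iteration")
    plt.ylabel("Hours")
    plt.legend(loc="upper left")
    plt.show()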

Once the report is in place, stakeholders may still wonder why velocity doesn’t match effort. Explain that velocity includes a scaling factor to account for estimate error and overhead. See “Explaining Estimates” in Chapter 8 for more ideas.
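
A quick, hypothetical worked example of that scaling factor may help; the 0.4 figure below is made up, and a real team discovers its own factor by measuring velocity.

    programmers = 6
    days_per_iteration = 5
    person_days_available = programmers * days_per_iteration    # 30

    scaling_factor = 0.4   # hypothetical share left after estimate error and overhead
    print(person_days_available * scaling_factor)               # 12.0 days of estimated work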

Some reports, although common, don’t provide useful information. “Estimating” in Chapter 8 provides suggestions for explaining your estimates and velocity.

Note

If a stakeholder asks you for one of these reports, don’t flatly refuse. Instead, find out why she wants the report and see if there’s a better way to meet her need.

What do you mean, “Progress reports are for stakeholder trust”? Shouldn’t we also report when we need help with something?

Absolutely. However, progress reports are for status; you shouldn’t assume that anyone actually reads them. Sometimes their existence is enough to satisfy stakeholders that you’re on track.

When you need a stakeholder’s help, whether to learn more about the business priorities or to overcome a hurdle, ask for it. Don’t rely on stakeholders to notice something in the reports.

What if some of our stakeholders want to micromanage us?

The product manager and project manager should manage the stakeholders. They should give them what they need while shielding the team from their micromanagement. They need to be tactful, yet firm.

Isn’t this just busywork? We have an informative workspace, stand-up meetings, and iteration demos. Stakeholders and managers can visit any time. Why do they need reports?

If your stakeholders attend your meetings and get sufficient value out of them, you probably don’t need reports. In that case, the project manager should talk to stakeholders about cancelling unnecessary reports.

Until you reach that point, don’t assume that writing solid code, delivering working software, and meeting real business needs will make everyone realize your value as a team. Sometimes you just need to staple a cover page to your TPS report in order to fit in.

What if programmers don’t want to track their time for the time usage report? They say they have better things to do.

They’re right—tracking time is a wasteful activity. However, the team has to balance the need to satisfy stakeholders with the need to use its time wisely.

You can make this decision easier to swallow in two ways. First, don’t mandate the report unilaterally. Instead, discuss the reasons to produce reports as a team and come to a joint conclusion about which reports to provide. Keep in mind that some team members have greater practical insight about the consequences of reporting, or of not reporting, than others.

Second, do everything you can to keep the time-tracking burden low. When it’s time to produce the report, the project manager should collate the data rather than asking programmers to do so.

Why should the project manager do all the grunt work for the reports? Shouldn’t he delegate that work?

The project manager’s job is to help the team work smoothly. He should never add to the workload of the people on the critical path. Instead, he should remove roadblocks from the path, and reports are one of those roadblocks.

Our organization measures employees individually based on the contents of certain reports. What do we do?

XP teams produce work as a team, not individually, so this is a difficult situation. First, the project manager should review the evaluation policy with HR and upper management. If there is any flexibility in the process, take advantage of it.

If there’s no flexibility, work within the review process as much as you can. Highlight teamwork wherever possible. When the team implements a particularly valuable feature, be sure to mention everyone’s contribution.

Appropriate reporting will help stakeholders trust that your team is doing good work. Over time, the need for reports will decrease, and you will be able to report less information less frequently.

Time spent on reports is time not spent developing. Technically speaking, reports are wasteful because they don’t contribute to development progress. As a result, I prefer to produce as few reports as possible.

Computerized planning may seem to make reporting easier. Unfortunately, it tends to do so at the expense of collaborative, dynamic planning (see “The Planning Game” in Chapter 8) and an informative workspace. That’s backward: it optimizes a wasteful activity at the expense of productive activities.

To avoid creating reports manually, I use the iteration demo, planning boards, and the burn-up chart as my only reports whenever I can. They are a normal part of the process and require no extra effort to produce. I use webcams or a digital camera to broadcast the boards if necessary.

I prefer not to report time usage, as it’s time-consuming to produce and programmers don’t like collecting the data, but I usually have to. I produce this report only when I sense that important stakeholders are concerned about the team’s velocity, and that concern is typical for companies new to XP. Reevaluate this need about once a month, and stop producing the time usage report as soon as you can.

Frequent communication can sometimes take the place of formal reporting. If this option is available to you, it’s a better option.

Why Does Software Cost So Much? [DeMarco 1995] contains an essay titled “Mad About Measurement” that discusses challenges of measuring performance, and includes a brief investigation into the results of reporting cyclomatic code complexity.

“Cannot Measure Productivity” [Fowler 2003] discusses the challenges of measuring software development productivity in more detail than I do here. http://www.martinfowler.com/bliki/CannotMeasureProductivity.html.