Most people would agree that good reports simply and clearly answer questions. Yet, while most reports provide the information that is needed to answer a question, many fall short of actually answering it.
Here is a simple test that you can use to see if your reports are designed effectively. Give the report to someone who doesn’t understand your business. Tell them the question you are trying to answer or the decision that you are trying to make. If they can’t answer the question or make the decision, then the report needs fixing.
If a report is meant to show where there are problems, anyone who looks at it should be able to identify the problem areas (even if they don’t understand what those areas are or why they are problematic). If the report is meant to help you figure out where to spend more marketing dollars, anyone who looks at it should be able to answer that question, even if they don’t understand your marketing structures or processes. Reports are meant to answer questions and need to do so simply and clearly. It’s the interpretation of those answers which requires an understanding of the business.
Too often, reports just provide a data dump that requires the reader to connect the dots. For example, many operational reports show daily or weekly statistics on queue times, completion rates, FTEs, quality metrics, and productivity metrics. However, they don’t provide the targets for those metrics. The assumption is that the reader knows the appropriate ranges into which each metric should fall and can assess them based on that. While that may be true, it’s not efficient. Why give the reader the extra cognitive task of recalling the ranges/targets if they are already known? It doesn’t matter whether the person should or does know them. By including them, you make the report more usable and speed up the time it takes to make a decision.
Even reports that take the next step and provide reference data sometimes still fail to actually answer the question. For example, a typical business scorecard report might list a series of metrics and their current performance, targets, and historical performance. Figuring out which metrics are on track and which are having problems is a relatively simple task of comparing the current performance against the target or past performance. However, despite its simplicity, it still places an unnecessary cognitive burden on the reader. If you know that they are going to make those comparisons, why not just provide them in the report (e.g., indicate where they are hitting or missing the target)? That way the reader can focus more time on making sense of the problem areas and less time manually identifying them.
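As a minimal sketch of what "make the comparison for the reader" means, the snippet below annotates each hypothetical scorecard row with a hit/miss flag before the report is rendered. The metric names, values, and the "higher is better" rule are all assumptions for illustration, not real data.

```python
# Hypothetical scorecard rows: (metric name, current performance, target).
# Assumes "higher is better" for every metric, to keep the sketch simple.
scorecard = [
    ("Completion rate", 0.91, 0.95),
    ("Quality score", 4.6, 4.5),
    ("Calls per FTE", 38, 35),
]

def annotate(rows):
    """Append the hit/miss comparison so the reader doesn't have to make it."""
    return [(name, current, target, "Hit" if current >= target else "Miss")
            for name, current, target in rows]

for name, current, target, status in annotate(scorecard):
    print(f"{name}: {current} vs {target} -> {status}")
```

The point is that the comparison is computed once, at report time, instead of being repeated mentally by every reader.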
Some reports attempt to provide the answer by using color coding or labels to highlight which items may need attention or action. That’s getting closer to the goal of using reports to answer questions and drive decisions. However, there is one more step that can further increase clarity and reduce the burden on the user. Instead of mapping status to the various data points (e.g., coloring each metric red, yellow, or green), you can map the data points to the status. In other words, turn the formatting upside down by organizing the report around the status colors. That way, items with similar status (red, yellow, green) all appear together. Ultimately, that is what the reader is trying to figure out, so why not present it that way? When I ask people what question they are trying to answer on their reports, many will say, “I want to know what is red, yellow, or green”. They don’t say, “I want to know the current performance and target.” They need that information but only to answer their real question, which is where they need to focus and where they are doing ok.
It’s not hard for a reader to go through a color-coded report noting all of the red items, but it’s unnecessary. It requires focus and attention that is better used for making sense of the red items. Why force someone to go through the entire report and mentally group the similar items when you can just display them grouped together in the first place? These types of reports, which I call “decision-based” reports, provide the clearest and simplest way of presenting your data. They require almost no effort to answer the question or make a decision because the data are organized around the answers or decisions. They also make it easier to spot patterns. If all items of a similar status are listed together, it’s easier to see what, if anything, they have in common. Anyone who looks at the report, regardless of his or her understanding of your business, can easily figure out where the problems lie.
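The "turn the formatting upside down" idea is just an inversion of a mapping: instead of tagging each item with a status, collect the items under each status. A small sketch, using made-up metric names and statuses:

```python
from collections import defaultdict

# Hypothetical metrics already tagged with a status color (item -> status).
metrics = [
    ("Queue time", "red"),
    ("Completion rate", "green"),
    ("Quality score", "yellow"),
    ("Productivity", "red"),
]

def group_by_status(items):
    """Invert item -> status into status -> [items] so similar items sit together."""
    grouped = defaultdict(list)
    for name, status in items:
        grouped[status].append(name)
    return grouped

grouped = group_by_status(metrics)
for status in ("red", "yellow", "green"):  # present in priority order
    print(status.upper(), "->", ", ".join(grouped.get(status, [])))
```

The same inversion works for the annual-ratings example discussed below: swap metric names for people and status colors for rating categories, and the report immediately shows who the top, middle, and low performers are.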
This approach isn’t just for status-based reports. It applies to any report in which you label or categorize data (which should be all of them). For example, you can apply this concept to reporting the annual ratings of your people. Instead of tagging each of your people with their rating, flip it and tag each rating category with the associated people. That way, you can quickly see who the top, middle, and low performers are.
The more time and energy your readers spend finding an answer on a report, the less time they have to make sense of that answer. The real value in using data isn’t just in spotting issues – that part is relatively easy and mechanical. The value comes from figuring out what to do about those issues. Design your reports so that they move people as quickly as possible from identifying the problem to thinking about how to solve it.