Lately I’ve been pushing leaders to stop having their teams walk them through their analysis and just start with the conclusion. I’m finding that many people are uncomfortable with this. Leaders don’t want to make decisions based on bad data. Even more, they don’t want to send bad data to their boss. So how do you check the data without slogging through the entire analysis all over again? The answer is to check the data in a way that pushes the conversation forward rather than backward.
Leaders create meaning through their context and perspective. Reviewing data is no different. Suppose you are reviewing recommendations on which regions need help improving sales. Don’t ask your team to provide specific details on how they analyzed the sales data or how each individual region did – that’s just rehashing old work.
Instead, ask about the conclusions. Is there a region that you believe (or know) has been steadily improving over the past six quarters? If so, ask whether they saw that trend. It’s a simple question, but it will drive toward greater understanding and it will “check” their process. If they didn’t look at trend data, you’ll find out. If they forgot that region, you’ll find out. If they did both, you’ll have saved time by not talking about a process that was done correctly, and you’ll move the conversation forward by talking about what the data means.
Another option is to ask questions about what you expected to see. Does the data confirm what you suspected? If so, why did you suspect that? Ask questions that confirm or refute those assumptions. Similarly, if the data doesn’t agree with what you expected, center your questions on those assumptions.
Asking good business questions, based on the context and perspective you have about the organization, will move the conversation forward and check the process more efficiently and more effectively. It will also give your people a greater sense of empowerment, since you are engaging them in a conversation about the business rather than just checking their work.
Here are a few tips to keep in mind:
- If you are not confident in an individual’s ability to execute the analysis properly (e.g., do the math correctly, get the numbers to foot, etc.), you need a different person (or you need to develop the one you have). It’s not in anyone’s interest for you to redo someone else’s work in the guise of a review.
- When the stakes are high and accuracy is critical, build a quality process that ensures the mechanical parts of the analysis are done correctly. Your value isn’t in adding and double-checking numbers; it’s in interpreting them.
- If you find that you spend more time on the details than on interpreting them, find out why. Are you unsure about the broader business issues? Did you commission an analysis without first thinking through the problem and what you expected to see? We often dive into the details because it’s safer than confronting the gaps in our knowledge of the big picture.
Brad, this is great, building on your earlier post. I’m interested in your experience with organizations that have been “trained” to use data as a way to justify actions or decisions. Managers in such organizations would look at your suggestion and say that you’re crafting the data to support your conclusion. I agree that providing the supporting context would help, but what other tools/tips do you have that would move such an organization away from using data as a blunt instrument for action?
Hey Chris,
Great question, and I get that a lot. The answer is really at the heart of why this is a leadership blog and not an analytics blog.
Basically, my recommendation to put my assumptions out there is not so I can use data only to support my views. What I didn’t say (and perhaps should have) is that, as a good leader, I have to be willing to find out that my assumptions were wrong. That’s the difference. The people who use data to support their arguments are the ones who suppress any data that doesn’t support them.
I want to get into a dialog about my argument to find out where the holes are or how I might need to adjust my thinking. Of course, the first step to doing this is that I have to be introspective enough to understand what those assumptions are in the first place.
The reason I said that this is a leadership issue and not a data issue is that I believe this is about integrity and openness. The leader who only uses data when it supports his or her viewpoint is probably the same leader who surrounds him/herself with people who think in the same way, don’t dissent, and only serve to prop him or her up. Not very effective in my book.
Good leaders put their cards on the table. Sometimes their cards are right and others learn from them. Sometimes their cards are wrong and they learn from others. Either way, the organization as a whole is getting smarter.
Brad,
I grasp the broad message but am still struggling back here in kindergarten. Can you recommend a good fundamentals book for someone still trying to stick their pinky toe into the pool of data analytics? All of this is very intimidating. Thanks.
It will depend on what is causing the confusion. While I generally file this type of topic under “analytics”, it really is a bit different.
Analytics is about getting to the number; the leading-with-data posts are about what you do once you have that number. My guess is that you are having trouble with the analytics side.
Here are a few recommendations:
Competing on Analytics by Tom Davenport and Jeanne Harris
Moneyball by Michael Lewis
Understanding Variation by Donald Wheeler
Super Crunchers by Ian Ayres
How to Measure Anything by Douglas Hubbard
I’d also recommend a basic statistics course. There is a great self-study series from The Teaching Company (www.teach12.com) called “Meaning from Data: Statistics Made Clear” and “What Are the Chances? Probability Made Clear”. The set is on sale now for $99 (I get no proceeds from them).