How much analysis is enough? You can always find more data, more questions, and more procedures that can be applied to data. It can be daunting. It can also lead to “analysis paralysis,” which seems to be the main concern of the leaders with whom I work.
Five simple criteria will help you assess the thoroughness of your analysis. They aren’t rocket science, but they will bring some diligence and structure to your work.
You’ve probably done a thorough analysis if:
· You’ve answered your question
· You’ve used multiple, disparate, and distinct sources
· The data are consistent – there is low variability
· The result makes sense and can be explained easily
· You’ve found the exception (there always is one)
You’ve answered your question
This might seem obvious. But it’s probably the biggest mistake on the list. Too often, we jump into the data without clearly defining (or without defining at all) the question we are trying to answer. We just start reading the reports, left to right, top to bottom.
Before you jump into the data, figure out what question you are trying to answer. Then figure out the criteria (or sub-questions) that will help you answer that question. Now, go answer those questions. When you are done, you’re done.
But how do you ensure that you haven’t missed an important sub-question? The answer is easy – talk to people before jumping into the data. Find out if they agree with the model that you’ve laid out (question and sub-questions). Ask if there are other questions that you need to answer.
More importantly, ask the person to whom you will be presenting the data. That will help you avoid doing unnecessary analysis and will help ensure that you don’t repeatedly go back to the drawing board answering additional questions. It also improves your chances of gaining buy-in on your findings since you already know what that person is looking for.
You’ve used multiple, disparate, and distinct sources
The two most important words here are disparate and distinct. Using sources that share the same beliefs or agenda isn’t much better than using just one source. For example, using only environmental organizations as sources of data on global warming might not give you a fully credible view. Mix your sources.

Often, opposing groups use similar data for their analysis; what differs is their interpretation of that data. If two opposing parties use the same data, the data are probably pretty credible. If their data differ, keep searching until you find some common facts upon which both agree. If you can’t, try to find some additional sources that might be more objective.

A current example of this is our national debt. Most politicians seem to agree on the size of the debt, so that number is pretty safe. There is huge disagreement, however, on how much the debt will be reduced by various proposed initiatives, so those figures are probably more suspect. The Congressional Budget Office was established to provide an impartial quantification of budgetary information. On the issue of savings, it is probably a better source than the spokesperson for either political party.
The data are consistent
I monitor my credit score on a monthly basis. The scores from the three agencies never match, but they are consistent: all three paint the same picture of me for creditors. If they varied widely, that would be a problem. Keep looking at the data until they begin to converge. They don’t have to match, but if the data are telling three different stories, you need to get some clarity.
The result makes sense and can be explained easily
If you have to take a lot of exotic twists and turns to build your argument, be suspicious. One of the many reasons cited for the financial crises of the past several years is that financial instruments and analyses became too complicated to understand. Clarity and simplicity are both products of clear thinking and analysis. Keep working until your argument is clear, concise, and easily understood.
You’ve found the exception
If all of your data fully support your conclusion, you might be suffering from “confirmation bias” – the result of seeking only information and data that support your view. Even the best solutions don’t work every time. If you’ve looked hard enough to find the exceptions, you’ve probably done a thorough analysis.
There are always more data and more questions. However, if you’ve been diligent in your analysis, you can be confident that your conclusions will hold up. They might not always be right, and there might be something you missed, but you and your analysis will still be credible.
Brad Kolar is an executive consultant, speaker, and author. He can be reached at brad.kolar@availadvisors.com.