Every now and then, you'll get a reading on a key metric that is just too good to be true, or too bad to be real. When you do, it's time to challenge that data before you draw conclusions and "run with it". Most often you'll find something that created the anomaly. And even if your initial impressions were in fact accurate, what's the harm of being a few days off on the reporting timetable in the interest of data integrity? In the end, it's a small price to pay.
We're all taught in school to double-check our work. Remember reverse engineering your math to see if you got the same answer? Remember how many times you caught that stupid mistake? Remember checking and rechecking that spreadsheet just because something didn't feel quite right? Remember how good it felt when the world finally made sense?
For every error we catch because something didn't feel quite right, there are errors that go uncaught because we're not looking for them. Remember the last time you stumbled onto a spreadsheet error by accident, five minutes before a key presentation? Not quite as good a feeling?
So what makes us so diligent at validating our work on some data but cut corners on other data? One reason is that it's human nature to have hypotheses about the performance of our business, and assumptions about what certain initiatives should produce. When they deliver, it's human nature to accept the result and move on. Think about the IT manager who completes an enterprise system implementation and sees immediate gains in efficiency. It's only natural to accept it quickly as truth and move on. We all have a burning desire to be right, even if the data is just a little too good to be true.
A good friend of mine is an equity trader who trades on technical signals and observations (i.e., he works purely off of chart breakouts and breakdowns). I find it fascinating how he is able to keep such a level head, no matter how good or bad a day he appears to have. You see, most investors measure their success on a "mark to market" basis, essentially judging their performance by the increase or decrease in the value of their portfolio at each market close. Most traders are either really enthusiastic, really disgusted, or neutral at the end of each day. My friend, however, has an uncanny ability to always stay neutral. He knows that if he has a 10% gain in his mark-to-market numbers, it is likely an anomaly. He knows the data and performance patterns so well that if and when such a condition occurs, he knows something well beyond his trading savvy has likely driven it. His first order of business is to make sure his calculations and assumptions are correct, and that the extraordinary gain or loss did in fact occur. If so, he accepts the windfall or extraordinary loss, knowing that the anomaly will likely be corrected in the future by an offsetting experience that brings his portfolio back to reality. Of course, there may be times when he has changed something in his methodology that yielded the improvement. But it's only after challenging his initial data and initial impressions that he will ever draw that kind of conclusion.
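The trader's discipline can be sketched as a simple screening step: compute each day's mark-to-market return, and flag any day whose move is large enough that it deserves validation before anyone celebrates or panics. The 10% threshold and the sample closing values below are illustrative assumptions, not the trader's actual method.

```python
def daily_returns(closes):
    """Percent change between consecutive closing portfolio values."""
    return [(b - a) / a for a, b in zip(closes, closes[1:])]

def flag_anomalies(returns, threshold=0.10):
    """Indices of days whose absolute return exceeds the threshold.

    Flagged days aren't wrong by definition; they're the ones to
    validate (check calculations, data feeds, assumptions) before
    drawing conclusions.
    """
    return [i for i, r in enumerate(returns) if abs(r) > threshold]

# Hypothetical portfolio closes; the jump on day 3 (~11%) gets flagged.
closes = [100.0, 101.5, 100.8, 112.0, 111.0]
suspect_days = flag_anomalies(daily_returns(closes))
```

The point of the sketch is that the flag triggers a review, not a reaction: an anomalous reading earns scrutiny first, celebration or alarm second.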
So my message for today is to avoid "kneejerk" reactions to performance data, even if, on the surface, it appears easily explainable. Validate it, challenge it, and stress-test your observations. You'll build a stronger reputation for data integrity, and your successes and "wins" will no doubt be sweeter.
Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at email@example.com