One of the age-old problems we encounter as performance managers is data reliability. While it should, intuitively, be the most important aspect of performance management, it is given far lower priority than its more “sexy” relatives.
ERPs, data warehouses, analysis engines, web reports…the list goes on. Each of these important PM dimensions gets its fair share of mind space and investment capital. But as the old adage goes, “garbage in, garbage out” (GIGO). We all know that data quality is a necessary prerequisite for any of these tools to work as designed. So why does so little time and attention go into cleaning up this side of the street?
Tell me you can’t identify with this picture. You’re sitting in a senior management presentation of last quarter’s sales results. Perhaps you’re even the presenter. You get to a critical part of the presentation, which shows a glaring break in a trend that has been steadily improving for months. It signals the obvious: something bad has happened, and we need to address it now! Conversation turns to the sales force, the lead qualification process, the marketing department, the competition… Forty-five minutes later, there is no real clarity, except for lots of “to-dos” and follow-up commitments.
Fast-forward two weeks (and several man-hours of investment) later. The Sales VP is pummeling one of his sales managers to “step up” performance and wants new strategies. A new commission structure is discussed, which brings in the need to get HR and IT involved. A few days later, while some of the new strategies are being implemented, a new story begins to unfold. An IT analyst, deep in the bowels of the organization, astutely recognizes THE big missing piece of the puzzle. You see, last month the manager of the Eastern Region changed the way he wants “sales closes” reported (the way deals are essentially recorded), from a basis of “client authorization” to one of “having the contract in hand”: a very useful distinction, particularly from a cash flow and accounting perspective. The only problem is that it was applied locally, not corporate-wide, producing the apparent data anomaly.
This sounds a bit too simple for a modern corporation well into the technology age, but unfortunately, this kind of story is all too common. We all understand the principle of GIGO, yet it continues to chew up corporate resources unnecessarily.
Overcoming the GIGO problem should be our number one priority: before systems, before reports, before analysis, before debate, and before conclusions are drawn. Before anything else, data quality comes first.
Here are a few tactics for getting a solid “data quality” foundation in place:
1. Understand the “cost of waste”-
We measure everything else, so why not measure the cost of poor data quality? Take a few of your recent GIGO experiences and quantify what the organization wasted on unnecessary analysis, debate, and dialog around seemingly valid conclusions gone awry. This doesn’t have to be complex; do it on the back of an envelope if you have to. Include everything that goes into it, including all the levels of management and staff that get involved. Then communicate it to your entire PM team. Make it part of your team’s mantra: data quality matters!
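The back-of-envelope math above can be sketched in a few lines. The roles, headcounts, hours, and loaded hourly rates below are purely illustrative assumptions for the sales-review story, not benchmarks:

```python
def gigo_incident_cost(entries):
    """Estimate waste from one GIGO incident.

    entries: list of (people, hours_each, loaded_hourly_rate) tuples.
    Returns the total cost as people * hours * rate, summed across roles.
    """
    return sum(people * hours * rate for people, hours, rate in entries)

# Illustrative numbers only (assumptions, not benchmarks):
incident = [
    (6, 0.75, 250),   # senior managers in the 45-minute review meeting
    (3, 8.0, 150),    # follow-up investigation by managers and analysts
    (2, 16.0, 100),   # IT/HR work on the commission change later abandoned
]

print(f"Estimated waste for one incident: ${gigo_incident_cost(incident):,.0f}")
```

Even with modest assumptions like these, a single incident runs into thousands of dollars, which makes the “communicate it to your team” step much easier.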
2. Become the DQ (Data Quality) CZAR in your company-
Most performance managers got where they are by exposing that “diamond in the rough”: by using data to advocate for change. It’s hard to imagine getting executive attention and recognition for something as “boring” as getting the data right, but that is what needs to happen. The increased visibility of post-Enron audit departments, SOX initiatives, and other risk management strategies has already started this trend. Performance managers must follow. You need to embrace DQ as something you and your department stand for.
3. Create Data Visibility-
In some respects this has already begun, but we have to do more. Our IT environments can disseminate information to every management level and location within minutes of publishing. But let’s go one step further: let’s “open the book” earlier in the process so that more of the people who can spot data issues can participate in the game. People have different roles in performance management: some are consumers, and some are providers. It’s just as important to create visibility for the input factors as it is to publish those sexy performance charts. You’ll get the input of that IT analyst I described above much earlier in the process.
4. Utilize External Benchmarks Where Possible-
Benchmarks are often used within organizations to set targets, justify new projects, defend management actions, and discover new best practices. These are all good and noble reasons to benchmark. One of the most overlooked benefits of benchmarking, however, is the role it plays (or should play) in your DQ process. I can’t tell you how many meetings I’ve been in where the presence of an external benchmark highlighted a key problem in data collection. Sometimes seeing your data compared against a seemingly erroneous metric can expose major breakdowns that would otherwise have gone undetected. Using comparisons to highlight reporting anomalies can be a very valuable use of external benchmarks.
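One way to operationalize this is a simple screen that compares internally reported metrics against external benchmarks and flags large deviations for review before anyone starts debating strategy. The metric names, figures, and 30% tolerance below are hypothetical assumptions, just to illustrate the shape of the check:

```python
def flag_outliers(reported, benchmarks, tolerance=0.30):
    """Return the names of metrics whose reported value deviates from the
    external benchmark by more than `tolerance` (as a fraction of the
    benchmark value). Flagged metrics go to DQ review, not straight to debate.
    """
    return [
        name
        for name, value in reported.items()
        if name in benchmarks
        and abs(value - benchmarks[name]) / benchmarks[name] > tolerance
    ]

# Hypothetical figures: the close rate tracks the benchmark, the cycle time
# deviates wildly, which may signal a data-collection problem, not performance.
reported = {"close_rate": 0.22, "sales_cycle_days": 95}
external = {"close_rate": 0.25, "sales_cycle_days": 45}
print(flag_outliers(reported, external))
```

A flag here doesn’t mean the benchmark is right and you are wrong; it means the gap is large enough that the data collection behind the number deserves a look first.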
5. Establish a DQ process-
It would be nice if all data were collected in an automated manner, where definitions could be hard-coded and “what to include” would never be in question. But in most companies, that is simply not the case. Our research has shown that over 50% of the data used in performance management processes is still collected manually, yet very few companies have a defined and auditable process for doing so. This does not have to be complicated: some very useful tools are emerging that help collect, validate, approve, and publish required data, just as there are for data reporting and scorecarding. Having a process, and a system to ensure that the process is followed, are both critical elements of data collection, and hence make for very good investments.
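Even a minimal intake gate helps. The sketch below shows one possible collect-and-validate step: every manual submission must declare which metric definition it used, and it is checked against the corporate standard before approval. The field names, the standard definition, and the rules are all hypothetical assumptions, not a reference to any particular tool:

```python
# Corporate-wide standard definitions (hypothetical): sales closes are
# counted only when the contract is in hand.
APPROVED_DEFINITIONS = {"sales_close": "contract_in_hand"}

def validate_submission(sub):
    """Return a list of DQ issues for a manual data submission.

    An empty list means the submission passes and can move on to approval
    and publishing; otherwise it goes back to the provider with the issues.
    """
    issues = []
    standard = APPROVED_DEFINITIONS.get(sub.get("metric"))
    definition = sub.get("definition")
    if definition != standard:
        issues.append(
            f"non-standard definition '{definition}' (expected '{standard}')"
        )
    if sub.get("value") is None or sub["value"] < 0:
        issues.append("missing or negative value")
    return issues

# The Eastern Region change from the story above would be caught at intake,
# before it ever reached a senior management presentation:
eastern = {"metric": "sales_close",
           "definition": "client_authorization",
           "value": 42}
print(validate_submission(eastern))
```

The point is not this particular code but the discipline: definitions travel with the data, and mismatches surface at collection time rather than in the boardroom.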
6. Don’t forget the Culture –
As I said above, most data will, for the time being, be collected manually, without fancy IT infrastructure. People will still be at the heart of that process. Invest time in helping them see the importance of the information they are collecting, how that information will be used, and what process will be followed to do so. Many organizations spend tens of millions on a systems solution to what is largely a people and culture problem. Investing in training and coaching can pay back as handsomely as those mega systems investments.
So as you navigate your internal data collection efforts, try to keep these tips in mind. Sometimes it’s the simple “blocking and tackling” that makes the difference between the winners and those in second place.
Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at firstname.lastname@example.org