The Top Mistakes Companies Make with their Data
ChatGPT wanted me to make sure I told you about “Executive Sponsorship” and “Change Management” and the usual platitudes and drivel. Gee, I wonder where it learned that?
Sure, these things are true… Master Data Governance does rely on compliance, and people indeed are more compliant if they’re on board with why we’re doing it. And, I’m guessing that 9 out of 10 readers wouldn’t have made it this far if there wasn’t some element of executive sponsorship in place.
Let’s hit on some hard, practical considerations when it comes to your Master Data. Here are three avoidable mistakes that companies consistently make.
- They’re not measuring the right stuff… operationally. As goes the old adage, “Begin with the end in mind.” If you’re not measuring the stuff that actually helps you optimize operations across the full range of business domains, then there’s a good chance you’re not collecting the right data. If you’re not collecting the right data, then does it really matter how clean it is?
This is not “data for data’s sake.” There is an ongoing cost to reliable master data. While it would be nice (and perhaps one day feasible) to collect everything, that’s far from practical for most businesses. For a metric to have business value, it must be timely, reliable (in that it’s hard to “cheat”), and informative. The right metrics go a long way to defining the policies and processes more generally associated with the discipline of Master Data Governance.
- They don’t systematically triage their data points. Every data point has a cost. As goes the old adage, “Let the hopeless die first.” Yes, the cost of storage, retrieval, processing and other computational tasks is at an all-time low. But, the cost of eliciting personnel compliance is at an all-time high. If you need to remind yourself of the true cost of eliciting compliance, think about trying to get your salespeople to update CRM correctly.

Systematic triage tells us where to focus what level of effort. If it’s easy to collect and of high value – new bookings for example – collect it and ensure it’s bulletproof.
If it’s more costly to collect and consume, but is of high value, consider some amount of pre-processing in order to minimize the cost of consumption. An example of this would be one of my favorite metrics: what % of our deals in stage W converted to stage X within timespan Y during period Z? I won’t go into why this metric is so valuable (much of its value lies in its reliability), but it turns out to be one of the more difficult metrics for which to collect data. I have found that the best time to prepare such data for consumption is close to the point of collection – waiting until you need to compile it for the monthly ops review only increases the cost of consumption.
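To make the stage-conversion metric concrete, here is a minimal sketch of the “prepare close to the point of collection” idea: record each stage entry as it happens, so that answering “what % of deals in stage W converted to stage X within timespan Y during period Z?” is a cheap lookup later. All field and stage names here are illustrative, not tied to any particular CRM.

```python
from datetime import date, timedelta

# Hypothetical stage-transition log, appended to at the point of collection:
# one record per deal per stage entry. Field names are illustrative.
transitions = [
    {"deal": "D1", "stage": "W", "entered": date(2024, 1, 5)},
    {"deal": "D1", "stage": "X", "entered": date(2024, 1, 20)},
    {"deal": "D2", "stage": "W", "entered": date(2024, 1, 10)},
    {"deal": "D2", "stage": "X", "entered": date(2024, 3, 15)},
    {"deal": "D3", "stage": "W", "entered": date(2024, 1, 12)},
]

def conversion_rate(log, from_stage, to_stage, within, period_start, period_end):
    """% of deals entering `from_stage` during the period [period_start,
    period_end] that reached `to_stage` within `within` (a timedelta)."""
    # Deals that entered the starting stage during the period of interest.
    entered = {r["deal"]: r["entered"] for r in log
               if r["stage"] == from_stage
               and period_start <= r["entered"] <= period_end}
    # Of those, count the ones that reached the target stage in time.
    converted = sum(
        1 for r in log
        if r["stage"] == to_stage
        and r["deal"] in entered
        and r["entered"] - entered[r["deal"]] <= within
    )
    return 100.0 * converted / len(entered) if entered else 0.0

# W -> X within 30 days, for deals entering W during January 2024.
rate = conversion_rate(transitions, "W", "X", timedelta(days=30),
                       date(2024, 1, 1), date(2024, 1, 31))
```

Because the transitions are captured as they occur, the monthly ops review consumes pre-shaped data instead of reconstructing deal history after the fact.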
- They look at each information technology system in isolation. As goes the old adage, “Can’t see the forest for the bark.” Multiple systems will carry the same data point. For example, your customer’s address is likely to exist in your CRM as well as your ERP. Not only does sound Master Data Governance suggest that these two systems should agree, but we may find that one system is more likely than the other to be reliable. We may find that the cost of collection is lower for one system than it is for the other. Relative reliability and relative cost of collection are going to drive decisions on both policy and whether and how to automate Master Data Governance-related functions.
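A simple sketch of the cross-system idea: pull the same data point from both systems, normalize away trivial formatting differences, and surface genuine disagreements for the policy (or automation) layer to resolve. The system names, customers, and normalization rules below are all hypothetical.

```python
# Hypothetical extracts of the same data point from two systems.
crm_addresses = {"ACME": "100 Main St., Springfield",
                 "Globex": "42 Elm Street, Shelbyville"}
erp_addresses = {"ACME": "100 main st, springfield",
                 "Globex": "17 Oak Ave, Shelbyville"}

def normalize(addr):
    """Crude normalization so formatting differences alone don't
    register as disagreements. Real matching would be more robust."""
    return (addr.lower()
                .replace(".", "")
                .replace(",", "")
                .replace("street", "st"))

def disagreements(primary, secondary):
    """Customers whose value differs between the two systems after
    normalization. `primary` is whichever system policy deems more
    reliable; its value is listed first for the resolution step."""
    return {cust: (primary[cust], secondary.get(cust))
            for cust in primary
            if normalize(primary[cust]) != normalize(secondary.get(cust, ""))}

mismatches = disagreements(crm_addresses, erp_addresses)
```

Here ACME’s addresses agree once normalized, while Globex’s do not; which system “wins” a disagreement, and whether the fix is automated or routed to a person, is exactly the policy decision that relative reliability and relative cost of collection should drive.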
While these are all common mistakes, avoiding them goes a long way towards the hard work of developing sound Master Data Governance policies that intelligently trade off compliance with standard operating procedures against automation.