IBM recently reported that bad data cost companies in the United States approximately $3.1 trillion last year. The question you should be asking is not whether your organization has bad data, but just how bad it is! Bad data are everywhere, and having effective Data Governance in place helps mitigate the consequences.
If you are interested in trying to measure the quality of your data, there are a couple of simple tests. The best one I have seen is the “Friday Afternoon Measurement” (FAM), developed by Thomas C. Redman, a leading figure in the world of Data Governance. Using this test you can easily quantify the quality of your data based on a representative sample. Typical test results show data quality at around 67%, well below the 100% of perfect data. Soon I will post a small FAM application you can download to run the test yourself.
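The core of the FAM idea can be sketched in a few lines of code: take a sample of roughly 100 recent records, flag each one as perfect or flawed, and report the percentage that are error-free. The record fields and the `missing_required` check below are hypothetical placeholders; in a real FAM exercise, people who know the data judge each record by hand.

```python
# Minimal sketch of a Friday Afternoon Measurement (FAM) score.
# The record layout and the flaw check are illustrative assumptions,
# not part of Redman's published method.
from typing import Callable

def fam_score(records: list[dict], is_flawed: Callable[[dict], bool]) -> float:
    """Return the percentage of sampled records judged error-free."""
    sample = records[:100]  # FAM works on a sample of ~100 recent records
    perfect = sum(1 for r in sample if not is_flawed(r))
    return 100.0 * perfect / len(sample)

# Hypothetical flaw check: a record is flawed if any required field is blank.
REQUIRED = ("customer_id", "email", "postal_code")

def missing_required(record: dict) -> bool:
    return any(not record.get(field) for field in REQUIRED)

records = [
    {"customer_id": "C1", "email": "a@example.com", "postal_code": "10001"},
    {"customer_id": "C2", "email": "", "postal_code": "10002"},
    {"customer_id": "C3", "email": "c@example.com", "postal_code": "10003"},
]
print(f"DQ score: {fam_score(records, missing_required):.0f}%")  # 2 of 3 perfect
```

In practice the "is it flawed?" judgment is made by reviewers scanning each record, not by an automated rule; the code simply shows how the score itself is computed.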
It can be quite a shock when you get your results and then do the math to calculate the actual cost of bad data for your organization. According to Thomas Redman, “it costs 10 times as much to complete a unit of work when the input data are defective as it does when they are perfect.” Now it should be a little clearer how we, as a country, could be losing $3.1 trillion in productivity due to bad data.
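The arithmetic behind that shock is simple. A rough sketch, using purely hypothetical volumes and unit costs: if a defective record costs ten times as much to process as a clean one, then a 67% FAM score (33% of records flawed) roughly quadruples your processing cost.

```python
# Illustrative "rule of ten" arithmetic. The record count, unit cost,
# and flaw rate below are hypothetical, chosen only to show the math.

def total_cost(n_records: int, pct_flawed: float, unit_cost: float) -> float:
    """Total processing cost when flawed records cost 10x clean ones."""
    flawed = n_records * pct_flawed
    clean = n_records - flawed
    return clean * unit_cost + flawed * unit_cost * 10

baseline = total_cost(10_000, 0.00, 1.0)  # every record clean
with_bad = total_cost(10_000, 0.33, 1.0)  # 33% flawed (a 67% FAM score)
print(f"all clean: ${baseline:,.0f}   with bad data: ${with_bad:,.0f}")
# $10,000 of work becomes $39,700: roughly 4x the cost
```

Each flawed record replaces one unit of cost with ten, so even a modest flaw rate dominates the total quickly.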
Data Governance is the management of all the data an organization holds, ensuring that high data quality exists throughout the complete data lifecycle. If your organization does not have a formal Data Governance strategy, it's very likely that unresolved bad data are increasing your operating costs significantly.
The interactive map below is a good starting point. Publishing your Data Governance Map as a visualization and sharing it across your enterprise helps communicate your strategic vision and priorities. Following the DG map are a few sample analyses generated using tools we have developed in support of Data Governance. If you are struggling to bring your data quality up to standard, we would love to help.
ben at alQemy.com