Our tests in the Nordic region show that organizations and businesses have between 5 and 30 percent duplicates in their databases. What is the price of these duplicates?
In an article in DM Review, Thomas C. Redman offers this assessment of the cost of bad data:
“Consider first the cost of efforts to find and fix errors. While organizations do, from time to time, conduct massive clean-up exercises, most efforts to find and fix errors are embedded in day-in and day-out work. Over the years, we developed the Rule of Ten: If it costs $1.00 to complete a simple operation when all the data is perfect, then it costs $10.00 when it is not (i.e., late, hard to interpret, incorrect, etc.).”
In my example I will use a price of 1 DKK per record and 10 DKK per incorrect record, and the conservative duplicate rate of 5%.
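To put numbers on this, the Rule of Ten calculation can be sketched in a few lines of Python. The database size of 100,000 records is an assumed figure for illustration; the per-record prices and the 5% duplicate rate are taken from the example above:

```python
# Rule of Ten applied to a database with duplicates.
# NOTE: the database size (100,000 records) is an assumed figure for illustration.
RECORDS = 100_000
DUPLICATE_RATE = 0.05       # conservative 5% duplicates
COST_CLEAN_DKK = 1.0        # 1 DKK per operation when the data is correct
COST_BAD_DKK = 10.0         # Rule of Ten: 10 DKK when the data is incorrect

duplicates = RECORDS * DUPLICATE_RATE
clean = RECORDS - duplicates

# Total processing cost with duplicates vs. a perfectly clean database
total_cost = clean * COST_CLEAN_DKK + duplicates * COST_BAD_DKK
baseline_cost = RECORDS * COST_CLEAN_DKK
extra_cost = total_cost - baseline_cost

print(f"Duplicates: {duplicates:,.0f} records")
print(f"Extra cost from duplicates: {extra_cost:,.0f} DKK")
```

With these assumptions, 5,000 duplicate records add 45,000 DKK in extra processing cost, since each duplicate costs 9 DKK more than a clean record. At the 30% duplicate rate seen in some databases, the extra cost would be six times higher.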
Cost of poor data
For this solution I have used the rental price of Omikron AddressCenter. With rental, the whole cost can be deducted as an operating expense, whereas buying the solution would count as a capital investment.