Data Quality problems in the Norwegian State Church

(Photo: Borgund Stavkirke)

I just read an article in the Norwegian paper Dagbladet.

In Norway there is a State Church. This means it is sponsored by the State, and a vast majority (about 85%) of the citizens are members of it.

In the last weeks the Norwegian Church has sent out 3.12 million election cards to Norwegians over 15 years of age. The problem is that many members have elected to leave the church, either choosing not to be members at all or becoming members of other churches or organizations. There seems to be a problem with the registration of the ones who have left.

This has severe financial implications, since churches get State support for each member, and the other churches and organizations miss out on this income. It might not sound like much, but it is estimated that more than 100,000 Norwegians are wrongly enrolled in the State Church. For the organization Human Etisk Forbund alone, it is a loss of about 3.4 million NOK.

It has also created hard feelings since this is an important issue for a lot of people.

There is no explanation of this Data Quality failure, whether it is a data entry problem, a cleansing problem or just pure sloppiness. I am just waiting for them to call on the heavenly powers they possess to solve the problem.

Increase your revenue by 66% with Data Quality

I stumbled on this interesting article on destinationCRM.com. It covers research from SiriusDecisions about how best practices in Data Quality can boost revenue by 66%. Best practice in Data Quality has earlier been shown to be a key to success when you implement MDM solutions.

I have earlier tried to show the cost of poor Data Quality, so it is good to also be able to show the benefit of optimal Data Quality.

There are several key areas where superior data management can have discrete benefits, according to the report. These follow the SiriusDecisions Demand Creation Waterfall methodology:

  • From inquiry to marketing-qualified lead: It’s most cost-effective to manage data at this early stage, rather than let flawed information seep through the organization. A data strategy that solves conflicts at the source can lead to a 25 percent increase in converting inquiries to marketing-qualified leads.
  • From marketing-qualified lead to sales-accepted lead: Bad source data is compounded by the use of multiple databases and formats, leading to distrust of marketing’s work by sales. Unifying the data, whether into one database or by using technology for virtual integration, can lead to a 12.5 percent uplift in conversion rates to the next stage.
  • From sales-accepted lead to sales-qualified lead: Scoring becomes important at this stage, as the sales team goes to work on the leads it can use — and returns others to the marketing team for further nurturing. Clean data can reduce by 5 percent the time spent conducting the kind of additional research that precedes initial contact with a prospect.
  • From sales-qualified lead to close: The benefits seen between sales qualification and close magnify those accumulated during the previous stages, as salespeople continually update the status and disposition of the potential customers. “Given that the average field-marketing function spends no more than 10 percent of its budget in support of this final conversion, accurate data is a must for applying the right tools and resources to the right audience at the right stage of the buying cycle,” Block writes. A single system of record to keep marketing and sales on the same page — cultivated by timely updates by all involved parties — is critical.

The impact of these abstract concepts — the true value of data management — becomes quite clear as soon as real numbers are applied: From a prospect database of 100,000 names, an organization utilizing best practices will have 90,000 usable records versus a typical company’s 75,000; at every stage thereafter, the strong company has a larger pool of prospects with a higher probability of closing. In the end, SiriusDecisions can show 66 percent more revenue for the company with high-quality data management.
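To make the arithmetic concrete, here is a minimal sketch of the funnel comparison. Only the pool sizes (90,000 vs. 75,000 usable records) and the quoted uplifts (25 percent and 12.5 percent) come from the report; the baseline conversion rates and the deal value are my own illustrative assumptions, since the report does not publish them.

```python
# Hypothetical funnel comparison: best-practice vs. typical organization.
# The pool sizes (90,000 vs. 75,000 usable records out of 100,000) and the
# stage uplifts (25% and 12.5%) are the SiriusDecisions figures quoted
# above; the baseline conversion rates and deal value are assumptions.

def revenue(usable, inq_to_mql, mql_to_sal, sal_to_close, deal_value=10_000):
    """Walk a prospect pool through a simplified three-stage funnel."""
    mqls = usable * inq_to_mql
    sals = mqls * mql_to_sal
    closed = sals * sal_to_close
    return closed * deal_value

# Typical organization: assumed baseline conversion rates.
typical = revenue(75_000, inq_to_mql=0.10, mql_to_sal=0.40, sal_to_close=0.20)

# Best-practice organization: larger usable pool, plus the quoted uplifts
# of 25% (inquiry -> MQL) and 12.5% (MQL -> SAL) applied to the baselines.
strong = revenue(90_000, inq_to_mql=0.10 * 1.25,
                 mql_to_sal=0.40 * 1.125, sal_to_close=0.20)

print(f"Typical revenue: {typical:,.0f}")
print(f"Strong revenue:  {strong:,.0f}")
print(f"Uplift: {strong / typical - 1:.0%}")  # 1.2 * 1.25 * 1.125 - 1, about 69%
```

Note how the gains are multiplicative: a 20 percent larger usable pool times the two stage uplifts gives 1.2 × 1.25 × 1.125 ≈ 1.69, in the same ballpark as the 66 percent the report cites.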

This shows me that the Data Quality Firewall and the other new concepts I introduced in September 2008 are the best way to optimize the data. The earlier you detect and correct poor data, the higher your revenue will be.

Survey shows you can realize 70% more revenue based on Data Quality

Just before Christmas an interesting survey was released by SiriusDecisions, a source for business-to-business sales and marketing best practice research and data.

It really confirms some of my views on the importance of good data quality in the sales process.

1st confirmation:
The cost of poor data is not taken seriously by senior management.

“Most b-to-b marketing executives lament the status of their databases, but have difficulty convincing senior management of the gravity of the problem,” notes Jonathan Block, SiriusDecisions senior director of research. Mr. Block continues, “The longer incorrect records remain in a database, the greater the financial impact. This point is illustrated by the 1-10-100 rule: It takes $1 to verify a record as it’s entered, $10 to cleanse and de-dupe it and $100 if nothing is done, as the ramifications of the mistakes are felt over and over again.”

I have stressed this in several of my earlier posts, where I have described both the 1 in 10 rule and the 1-10-100 rule.
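As a back-of-the-envelope illustration of the 1-10-100 rule, here is a small sketch. The per-record dollar figures are the ones from the rule as quoted above; the batch size and error rate are assumptions for the sake of the example.

```python
# Illustrative 1-10-100 rule calculation. The $1/$10/$100 per-record costs
# come from the rule quoted above; the batch size and error rate are
# assumed numbers, just to show the scale.
records = 10_000
error_rate = 0.15                      # assumed share of bad records
bad_records = records * error_rate

verify_at_entry = records * 1          # $1 to verify every record as entered
cleanse_later = bad_records * 10       # $10 to cleanse and de-dupe each bad record
do_nothing = bad_records * 100         # $100 as the mistakes are felt over and over

print(f"Verify at entry: ${verify_at_entry:>9,.0f}")   # $10,000
print(f"Cleanse later:   ${cleanse_later:>9,.0f}")     # $15,000
print(f"Do nothing:      ${do_nothing:>9,.0f}")        # $150,000
```

Even with a modest error rate, letting the errors sit costs an order of magnitude more than catching them at entry.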

2nd confirmation:
The problem is tremendous.

“Focusing on b-to-b sales and marketing best practices, the firm has found that from 10 to 25 percent of b-to-b marketing database contacts contain critical errors — ranging from incorrect demographic data to lack of information concerning current status in the buying cycle.”

By the way, the tests we have done in the Nordics show that there are from 5% to 35% errors in the databases; the average is 16.7% errors.

3rd confirmation:
Ongoing cleansing is more important than a one-time approach.

“Organizations must shift their focus from one-time data cleansing to ongoing data maintenance to turn the tide,” says Mr. Block. “The good news is that we’re seeing a strategic shift in approach in strong organizations, from one of data cleansing (a project with a set completion date) to data maintenance (ongoing policies and procedures to maintain data quality). The fundamental trouble with one-time data cleansing is that the day the project ends, the data is the cleanest it will be until the next round of contacts is added to the database.”

This is commented on in this earlier post.

4th confirmation:
The upside is huge

SiriusDecisions also estimates that organizations with an early-phase data strategy can expect a roughly 25 percent uplift in conversion rates between the inquiry and marketing qualified lead stages.

“Using an example of a prospect database of 100,000 names at the outset and a constant campaign response rate of two percent, a strong organization will realize nearly 70 percent more revenue than an average organization purely based on data quality. For those marketing executives having problems convincing senior management that a permanent process upgrade rather than a ‘quick fix’ will pay big dividends in the long run, this is the kind of eye-opening statistic that should prove invaluable.”

It is interesting to see this kind of estimate, since making an easy ROI calculation for Data Quality projects is difficult.

Another article about ignorance of Data Quality

There is another article in IT-Pro which deals with the ignorance of data quality issues.

Some excerpts that make you think:

Nearly three in five (58 per cent) of UK executives surveyed said they could not confirm that a documented strategy exists to keep their contact data accurate and up-to-date.

Despite this, nearly all (96 per cent) recognise that inaccurate data has a direct financial impact on their operations, with 19 per cent admitting to it having a negative impact on revenue or funding.

Another interesting factor they mention is this:

Only eight per cent of organisations validate all the information they collect, while 34 per cent validate none of the information they collect and enter into their systems.

A good solution might be to install a proper Data Quality Firewall.
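To illustrate what validating information at the point of entry could look like, here is a toy sketch of the idea. The record fields and rules are hypothetical examples, not a description of any particular product.

```python
import re

# Toy validate-at-entry check (the "firewall" idea): reject or flag records
# before they reach the database. Field names and rules are hypothetical.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_contact(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field in ("name", "email", "postal_code"):
        if not record.get(field):
            errors.append(f"missing required field: {field}")
    if record.get("email") and not EMAIL_RE.match(record["email"]):
        errors.append(f"malformed email: {record['email']}")
    return errors

incoming = [
    {"name": "Ola Nordmann", "email": "ola@example.no", "postal_code": "0150"},
    {"name": "", "email": "not-an-email", "postal_code": "0151"},
]

for rec in incoming:
    problems = validate_contact(rec)
    status = "accepted" if not problems else "rejected: " + "; ".join(problems)
    print(f"{rec['email'] or '<no email>'} -> {status}")
```

The point is that the same checks sit in front of every entry channel, so a bad record is stopped once at the gate instead of being cleansed over and over downstream.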

Survey: IT pros ignore poor data quality

There is an article in IT-Pro dealing with how IT pros ignore poor data quality. The article is about a survey conducted by Information Difference.

Some of the findings:

  • 15 per cent rate their Data Quality as “high or very high”.
  • 21 per cent believe the inaccuracies in data cost them between $10 million and $100 million per year.
  • 14 per cent thought direct costs because of poor master data were less than $1 million a year.
  • 30 per cent have not deployed a tool to combat the problem.

“We often hear that MDM is a key concern for CIOs, yet in reality it seems there’s still very little actually being done about it,” said Andy Hayler, Information Difference chief executive.

“Particularly worrying is that over half the senior IT executives we spoke to weren’t consistently sure whose spreadsheet has the correct data,” added Hayler.

I just wonder when it will be taken seriously.

Fight Global Warming with Data Quality

The Direct Mail industry is urged to go green. Experian refers to surveys showing that if Direct Mail companies want to reduce their waste and be more cost-efficient, they have to make better use of Data Quality systems. This will ensure their records are as accurate as possible.

I haven’t found this survey myself, and cannot link to it.

But I believe the survey.  Let’s do the math with an imagined example:

A telephone company sends out 1,000,000 letters in a DM campaign.
They have 5% duplicates/bad records, which is 50,000 wasted letters.
Each letter weighs 20 grams, so the waste is 1,000 kg of paper per campaign.
This equals 24 trees. They do 10 campaigns a year, and waste 240 trees.

What would the economic incentive be for this company to save 240 trees?
50,000 wasted letters ten times a year at a cost of 80 cents each comes to 400,000 euros in direct costs. On top of that come soft costs like customer attrition, employee dissatisfaction and error rework, and the list goes on. Not so bad in these hard economic times?
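The same back-of-the-envelope math in a few lines of code; all inputs are the figures assumed in the example above.

```python
# Back-of-the-envelope waste calculation for the imagined DM campaign.
# All inputs are the assumed figures from the example above.
letters_per_campaign = 1_000_000
bad_record_rate = 0.05           # duplicates / bad records
letter_weight_kg = 0.020         # 20 grams per letter
trees_per_kg = 24 / 1_000        # about 24 trees per 1,000 kg of paper
campaigns_per_year = 10
cost_per_letter_eur = 0.80

wasted_letters = letters_per_campaign * bad_record_rate * campaigns_per_year
wasted_paper_kg = wasted_letters * letter_weight_kg
wasted_trees = wasted_paper_kg * trees_per_kg
direct_cost_eur = wasted_letters * cost_per_letter_eur

print(f"Wasted letters/year: {wasted_letters:,.0f}")       # 500,000
print(f"Wasted paper:        {wasted_paper_kg:,.0f} kg")   # 10,000 kg
print(f"Wasted trees:        {wasted_trees:,.0f}")         # 240
print(f"Direct cost:         EUR {direct_cost_eur:,.0f}")  # 400,000
```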

From CW: Dirty Data Blights the Bottom Line

I stumbled over this article in ComputerWorld, and just saw the relevance to my post on the sexiness of Data Quality:

Data quality isn’t a glamorous topic, but companies ignore it, especially for internal systems, at their financial peril.

I advise you to read the article, since it includes a good case study and some very valuable hints on how to go about a successful DW project.

I feel like I am repeating myself, but hopefully I will wake someone up about Data Quality!