Software supermarkets or specialized vendors?


Some customers are worried about the complexity of the data quality challenges. Understandably so, since they hear about a lot of projects that have failed. There are several reasons for this; an important factor is using the right tool. If you want to loosen a nut from a bolt, there are several tools you might use.

Some use pliers, which is almost a sure failure.  Others use an adjustable wrench.  This might work, but if it does not grip tightly enough, there is a big chance the nut will be rounded and suddenly become very hard to get off.  That creates a lot of extra work, and the bolt cannot be reused.  If you use the correct wrench, the nut comes off, and it can be reused.

Let me transfer this to the Data Quality world.  Way back I was approached by a prospect whose MDM project had hit the wall.  They had spent over a year on it so far.  They had used all the products from the Microsoft enterprise supermarket and, together with external consultants, built some elaborate business rules.  In this way they had found about 23% duplicates.  Their problem was that they could see there were more, but they could not catch them.  They sent the data over, and a couple of days later we sent the results back, with an additional 27% duplicates found in the already-cleansed data.  This showed me the power of specialized tools.

This is the result from Google when I searched for Data Quality Tools

There are some great software supermarkets out there, and they often offer excellent products.  Customers often want as few vendors as possible, with one-stop shopping if they can get it.  You build special competence on that vendor's products, and it can be cost-efficient.  One challenge with this strategy is that you might miss out on a specialized product that can be critical for your success. An example: Gartner estimates that 50% of CRM/ERP installations fail due to poor data quality and integration.

Data might be the most important asset in your company; don’t you want the best available product to handle this precious asset?  If I had the king over for dinner, and I knew that there was a 50% chance of failure if I used the meat from the supermarket, I would definitely go to the best butcher in town.  90% of the purchase would still be from the supermarket, and the last 10% from the specialized vendor.  One thing does not exclude the other.

These challenges are not unique to the Data Quality field; they are just as present in the e-Commerce world.

Setting up a Front-end Data Quality Firewall


In a project with an international vendor some years ago, I introduced the concept of splitting the Data Quality Firewall (DQF) into a Frontend and a Backend Data Quality Firewall. These terms are spreading, and I get questions on how you should set up the Frontend DQF; the last query came just this week via Twitter. My focus is not on the technical side, but on the usability and reward for operatives and companies.

Why is the Frontend DQF important?

I participated in the Information Quality Conference in London, where it was stated that 76% of poor data is created in the data entry phase. Being proactive in the data entry phase, instead of reactive (sometime, if ever) later, will take you a long way towards good, clean data.

Elements of the Frontend DQF.

First, identify the systems in which data is created. It may be a variety of systems like CRM, ERP, Logistics, Booking and Customer Care, just to mention a few.

Error tolerant intelligent search in Data Entry systems.

Operatives have been taught by Google and other search engines to go directly to the search box to find information. But when they search in customer entry systems, they very often do not find the customer. To fix this you need error tolerance and intelligence in your search functionality, as well as a suggestion feature. This will help operatives find the entry despite typos, different spellings, mishearings and sloppiness, and it will be the biggest contributor to cleaner data. A spinoff is higher employee and customer satisfaction due to more efficient work.
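To make the idea concrete, here is a minimal sketch of error-tolerant search using Python's standard-library difflib. Commercial products use far more sophisticated phonetic and n-gram matching, and the customer names and the 0.8 threshold here are made up for illustration:

```python
from difflib import SequenceMatcher

def fuzzy_search(query, customers, threshold=0.8):
    """Return customers whose name is similar to the query,
    tolerating typos and alternative spellings. Best matches first."""
    results = []
    for customer in customers:
        score = SequenceMatcher(None, query.lower(), customer["name"].lower()).ratio()
        if score >= threshold:
            results.append((score, customer))
    # Sort like a search engine's suggestion list: highest similarity on top
    return [c for _, c in sorted(results, key=lambda r: r[0], reverse=True)]

customers = [
    {"id": 1, "name": "Jon Erik Hansen"},
    {"id": 2, "name": "Maria Olsen"},
]

# A query with a transposition typo still finds the likely match
print(fuzzy_search("Jon Eirk Hansen", customers))
```

The same similarity score can drive a "did you mean ...?" suggestion when nothing clears the threshold.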

If you want to learn more about error tolerance and intelligent search, read these posts:
Making the case of error tolerance in Customer Data Quality
Is Google paving the way for actual CRM/ERP Search?
Checklist for search in CRM/ERP systems.

Data Correction Registration Module

If you did not find the customer and the operatives have to enter the data, you have to make sure the data entered is accurate. You can install a module, or workflows, that checks and corrects the information.

Most CRM systems will only find the customer if you do exact searches

Check against address vendors
If you have a subscription with an address vendor, you can send the query to them, and they will supply you with the most recently updated data. You can set this up so it is easy for the operative, and the data will be correctly formatted for your systems.

This is quite easy for one country. If you are an international company, the laws and regulations differ from country to country. In addition, the price can run up if you want local address vendors in several countries. It is important that your registration module can communicate with the local vendors, then format the data and make the entry correctly into your database(s).
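As a sketch of that formatting step, here is what mapping a vendor's response onto your own schema might look like. The vendor field names and the sample record are invented for illustration; every address vendor has its own API and response shape:

```python
# Hypothetical vendor-to-internal mapping. The vendor's field names
# ("street_address" etc.) are invented; every vendor differs.
def to_internal_format(vendor_record, country):
    """Map an address vendor's response onto our own database schema,
    so the entry is stored consistently regardless of country/vendor."""
    return {
        "street": vendor_record["street_address"].title(),
        "postal_code": vendor_record["postal_code"].replace(" ", ""),
        "city": vendor_record["city"].title(),
        "country": country.upper(),
    }

# What a (fictional) Norwegian vendor might return for a query
vendor_record = {"street_address": "storgata 1", "postal_code": "0155", "city": "oslo"}
print(to_internal_format(vendor_record, "no"))
# → {'street': 'Storgata 1', 'postal_code': '0155', 'city': 'Oslo', 'country': 'NO'}
```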

Correct the Data Formats

You might choose not to subscribe to online verification by an address vendor. There are still many checks you can do in the data entry phase. You can check:

– is the domain of the e-mail valid?
– is the format of the telephone number correct?
– is the mobile number really a mobile number?
– is the salutation correct?
– is the format of the address correct?
– is the gender correct?
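A few of these checks can be sketched in a handful of lines. The rules below (a well-formed e-mail shape, 8-digit Norwegian-style phone numbers, mobiles starting with 4 or 9) are simplified assumptions for illustration; a real registration module would validate against reference data, not just regular expressions:

```python
import re

def check_entry(entry):
    """Run simple format checks on a new customer entry.
    Returns a list of problems found (empty means the entry passed)."""
    problems = []

    # Is the e-mail address well formed, with a plausible domain?
    # (A real firewall would also verify the domain's MX record.)
    if not re.match(r"^[^@\s]+@[^@\s]+\.[a-zA-Z]{2,}$", entry.get("email", "")):
        problems.append("invalid e-mail address")

    # Is the telephone number in the expected format?
    # Here: optional country code followed by 8 digits (Norwegian style).
    if not re.match(r"^(\+\d{1,3})?\d{8}$", entry.get("phone", "").replace(" ", "")):
        problems.append("invalid telephone number format")

    # Is the mobile number really a mobile number?
    # Assumption for illustration: Norwegian mobiles start with 4 or 9.
    mobile = entry.get("mobile", "").replace(" ", "")
    if mobile.startswith("+47"):
        mobile = mobile[3:]
    if not re.match(r"^[49]\d{7}$", mobile):
        problems.append("mobile number is not a mobile number")

    return problems

entry = {"email": "ola.nordmann@example", "phone": "+47 22334455", "mobile": "91234567"}
print(check_entry(entry))  # → ['invalid e-mail address']
```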

Example of registration module

Check for unwanted scam and fraud

You can check against:

– internal black lists
– sanction lists
– “Non Real Life Subjects”

Duplicate check

Even though duplicates should have been found in the search, you should do an additional duplicate check when the entry is completed.
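A minimal sketch of such a post-entry duplicate check, again using simple fuzzy string comparison from Python's standard library. The records and the 0.85 threshold are illustrative; production matching would also use phonetic keys and field-by-field rules:

```python
from difflib import SequenceMatcher

def likely_duplicates(new_record, existing, threshold=0.85):
    """After an entry is saved, compare it against existing records.
    Name and address are normalized before fuzzy comparison."""
    def normalize(record):
        return f"{record['name']} {record['address']}".lower().strip()

    candidates = []
    for record in existing:
        score = SequenceMatcher(None, normalize(new_record), normalize(record)).ratio()
        if score >= threshold:
            candidates.append((round(score, 2), record))
    return candidates

existing = [{"name": "Kari Nordmann", "address": "Storgata 1, Oslo"}]
new_record = {"name": "Kari Nordman", "address": "Storgata 1, 0155 Oslo"}

# Flags the near-identical record despite the dropped 'n' and added postcode
print(likely_duplicates(new_record, existing))
```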

If you incorporate these solutions, you should be able to ensure that the data you enter is clean and correct. It should be possible to get all of this from one vendor.  Then you can use the Backend DQF to handle the cleansing of deteriorating existing data.

Will Retailers succeed with e-Commerce because of Data Quality?

Trends from the Nordics are confirmed in England: retailers with physical stores are the fastest growing players in the online shopping sector.

They have been a little slow to roll out, because of challenges like cannibalization, logistics and branding. I know that several chains are moving to launch large worldwide shopping sites. In my opinion they will be highly successful and pass the early movers on the net. My basis for this is that they take the challenge of Data Quality seriously – maybe not by choice, but out of pure necessity.

The retailers have multiple points of sale, and multiple points of customer data storage.  Points of sale can be Telephone/Customer Service, Customer Clubs and Physical Stores, while customer data can be stored in CRM, ERP and logistics systems.  In addition they have several brands, each of which also stores customer data.  When they then try to do multichannel marketing, it is impossible with the structure they have today.

Data Quality For E-Commerce cluttered

Multichannel seems a little challenging in this picture……

The solution the chains choose is to build a Master Database with one customer ID, linked to each brand. Once you have the master data, you can start analyzing for cross-selling opportunities: if you are a customer of brand 1, you are also a likely customer of brand 5.  In addition, you have tools to filter the information from the multiple points of sale and data storage. When customer data is entered at one of the points, it is checked for duplicates, matched to the right record and checked for fraud. You can also set up error-tolerant CRM search for the points of sale and enrich the data with reference data.
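As an illustration, the master structure and the cross-selling analysis it enables might look like this minimal sketch (the master IDs, brand names and record keys are all made up):

```python
# Illustrative master-data structure: one master ID per customer,
# with links to that customer's records in each brand's own database.
master_customers = {
    "M-1001": {
        "name": "Kari Nordmann",
        "brand_records": {"brand1": "B1-552", "brand5": None, "brand7": "B7-88"},
    },
}

def cross_sell_candidates(master, target_brand):
    """Customers who exist in some brand but have no record yet in
    target_brand are likely prospects for that brand."""
    return [
        master_id
        for master_id, record in master.items()
        if record["brand_records"].get(target_brand) is None
    ]

print(cross_sell_candidates(master_customers, "brand5"))  # → ['M-1001']
```

The point of the single master ID is that the question "which of our customers does brand 5 not know about yet?" becomes a trivial lookup instead of a cross-database matching project.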

Data Quality For E-Commerce clean
Now you are ready for Multichannel Marketing!

The first movers, who often were pure online players, have not had the need for such a rigid Data Quality setup. Where retailers now use professional tools, the pure players still trust their homemade ones. They will wake up one day and wonder what happened.

Focused vendors get highest customer feedback in new Data Quality Research

The Information Difference Ltd, a British analyst firm, has published research on the vendors in the Data Quality market. You can read the full report here.

Here is the Diagram that Information Difference has made of the DQ Vendors.

The major vendors in the data quality market are described using the Landscape diagram that follows (see later for more on how this is derived; note that due to a modification in the methodology this time around—this version includes customer satisfaction feedback—like-for-like comparisons between this chart and previous versions should be treated with caution).

Data Quality Landscape Q1 2009

Another interesting conclusion in the report is that:

The highest customer feedback scores in this research were for Melissa Data, Datactics and Omikron.

These vendors are focused on solving the DQ challenges at hand – sometimes so focused that the larger vendors call them niche vendors. Maybe providing fast and correct results can be as important as delivering large suites?

Is Data Quality finally getting traction?

I have just attended MDM Summit Europe 2009 – very interesting!

This was about Master Data Management – but the single theme that was discussed the most was Data Quality.

Statements like “Data Quality is MDM” and “MDM has to have the best Data Quality” were repeated throughout the Summit.

Can we hope that the issue of Data Quality is finally getting the traction it needs?

We need more of these Data Quality Summits!

Quiz about Data Quality

Are you situated in Norway?  My company is running a quiz about Data Quality until the 15th of May, with a new question every week.  You can win a washing machine (since that is what data cleansing and data scrubbing are all about) or a gift card from Elkjøp worth 10,000 NOK.

We do this to put the focus on Data Quality, with a capital DQ.

You can access the quiz here:

Increase your revenue by 66% with Data Quality

I stumbled on this interesting article on destinationCRM.com. It covers research from SiriusDecisions on how best practices in Data Quality can boost revenue by 66%.  Best practice in Data Quality has previously been proven to be a key to success when implementing MDM solutions.

I have tried to show the cost of poor Data Quality before, and it is good to also be able to show the benefit of optimal Data Quality.

There are several key areas where superior data management can have discrete benefits, according to the report. These follow the SiriusDecisions Demand Creation Waterfall methodology:

  • From inquiry to marketing-qualified lead: It’s most cost-effective to manage data at this early stage, rather than let flawed information seep through the organization. A data strategy that solves conflicts at the source can lead to a 25 percent increase in converting inquiries to marketing-qualified leads.
  • From marketing-qualified lead to sales-accepted lead: Bad source data is compounded by the use of multiple databases and formats, leading to distrust of marketing’s work by sales. Unifying the data, whether into one database or by using technology for virtual integration, can lead to a 12.5 percent uplift in conversion rates to the next stage.
  • From sales-accepted lead to sales-qualified lead: Scoring becomes important at this stage, as the sales team goes to work on the leads it can use — and returns others to the marketing team for further nurturing. Clean data can reduce by 5 percent the time spent conducting the kind of additional research that precedes initial contact with a prospect.
  • From sales-qualified lead to close: The benefits seen between sales qualification and close magnify those accumulated during the previous stages, as salespeople continually update the status and disposition of the potential customers. “Given that the average field-marketing function spends no more than 10 percent of its budget in support of this final conversion, accurate data is a must for applying the right tools and resources to the right audience at the right stage of the buying cycle,” Block writes. A single system of record to keep marketing and sales on the same page — cultivated by timely updates by all involved parties — is critical.

The impact of these abstract concepts — the true value of data management — becomes quite clear as soon as real numbers are applied: From a prospect database of 100,000 names, an organization utilizing best practices will have 90,000 usable records versus a typical company’s 75,000; at every stage thereafter, the strong company has a larger pool of prospects with a higher probability of closing. In the end, SiriusDecisions can show 66 percent more revenue for the company with high-quality data management.
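The arithmetic behind the compounding can be sketched roughly as follows. The 20% larger usable pool (90,000 vs. 75,000) and the 25% and 12.5% conversion uplifts come from the report above, but the baseline conversion rates are invented for illustration, so the resulting uplift only lands in the same ballpark as the report's 66% figure:

```python
# Back-of-the-envelope funnel model. Uplifts are from the report;
# the baseline conversion rates (0.10, 0.40, 0.20) are invented.
def closed_deals(usable_records, inquiry_to_mql, mql_to_sal, close_rate):
    # Multiply the prospect pool through each funnel stage
    return usable_records * inquiry_to_mql * mql_to_sal * close_rate

# Typical company: 75,000 usable records and baseline rates
typical = closed_deals(75_000, 0.10, 0.40, 0.20)
# Best practice: 90,000 usable records, +25% and +12.5% conversion uplifts
best_practice = closed_deals(90_000, 0.10 * 1.25, 0.40 * 1.125, 0.20)

print(f"typical: {typical:.0f} deals, best practice: {best_practice:.0f} deals")
print(f"uplift: {best_practice / typical - 1:.1%}")
```

With these illustrative numbers the uplift comes out just under 69%; the exact figure depends on the baseline rates, but the mechanism is the same: a cleaner, larger pool at the top compounds through every stage below it.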

This shows me that the Data Quality Firewall and the new concepts I introduced in September 2008 are the best way to optimize the data. The earlier you detect and correct poor data, the higher your revenue will be.

“Standardizing and normalizing disparate customer data is akin to having root canal surgery at the dentist.”

I found another interesting article on SearchDataManagement.com about why Data Quality is elusive at most organizations.

The most interesting points, according to Gartner:

  • Half of the companies have actually deployed data quality tools or started Data Quality initiatives.
  • Of those who use data quality tools, less than one third have deployed them enterprise-wide.

The reason is that the information is siloed in different databases collected from different sources, with no tools to connect them.  Or as Leslie Ament, Managing Partner at Hypatia Research, says:

“Many larger retailers have upwards of 10 different databases with different schema for collecting customer data,” Ament said. “Standardizing and normalizing this information is akin to having root canal surgery at the dentist.”

I like analogies like that; hopefully we can get customers to address the challenges, even though it hurts.  The good thing is that I think businesses are becoming increasingly aware of these challenges.

Read the whole article here.

10 Critical questions to ask when you implement CRM

I came across an interesting article called Saving CRM: Creating a data quality program by Douglas Ross.  Mr. Ross is VP & CTO of Western & Southern Financial Group, so this is written from a user's point of view, not a vendor's.  It is a well-pointed article I urge everybody to read.

The most interesting part of the article is the list of 10 Questions you should ask before you start the project.  I have listed them here.

1. Have the benefits of improved data quality been defined for and agreed upon by the senior executives in the business?
2. Has the organization defined architectural standards for data, the relationships between data items, and requirements for data usage including those levied by the audit, regulatory, and compliance areas?
3. Does the organization measure data quality and strive for continuous improvement using agreed upon metrics, scorecards, and dashboards?
4. Are data-entry personnel equipped with tools to help enter clean data into the target systems?
5. Are there formal data stewardship roles, and are the related processes well-defined?
6. Do the systems you intend to integrate all support a universal, immutable customer identifier?
7. Do the target systems support all the necessary data elements and the actual relationships between products, people, accounts, and employees?
8. Do the target systems cooperate with one another to maintain data integrity, or do they “fight it out” and overwrite one another’s information from time to time?
9. Has the organization undertaken a bulk cleanup project to cleanse or rationalize legacy data?
10. Does the IT organization understand the benefits of clean data in driving improved business results?

If you answered “no” to nine or more questions, you’re in the same boat with a lot of other organizations.

I hope these questions will be mandatory for future implementations.

Read the article here.

MDM Expert forecast for 2009: Data Quality will be core to delivering effective master data

Bob Karel, principal analyst at Cambridge, Mass.-based Forrester Research, has published his predictions for Master Data Management in 2009 on SearchDataManagement.com.

  • Cross-enterprise MDM adoption will remain extremely rare.
  • Expect a messaging shift from MDM vendors to promote how MDM can mitigate risk.
  • The data quality market will grow as customers recognize it as a cheaper precursor to MDM.
  • MDM market consolidation will continue.

This is why the DQ market will grow:

Data quality management is core to delivering effective master data, and customers who balk at the extremely high start-up costs of MDM software and services will recognize that the more mature data quality market may effectively meet the 80/20 rule of their trusted data requirements — at a fraction of the cost. Operational MDM that bi-directionally synchronizes master data across the enterprise will follow once the value of data quality investments is realized.

This is a trend I have experienced for the last 6 months in the Nordics, and I think it will only get stronger, as Mr. Karel predicts.