The Biggest Sin is the Sin of Omission
By Elliot King
People of a certain age may remember the advertising campaign by an American
manufacturer that asserted that “the quality goes in before the name goes on.”
While the manufacturer ultimately went the way of all the companies that actually built
televisions in the United States, the advertising message was compelling: a product’s ingredients count.
If information is an organization’s most important asset–and it is–then data is the critical component in its production. The quality of your data should go in before your name goes on your information, so to speak. But even companies that make a commitment to improving their data quality make errors in the way they implement their data quality programs. Some of those errors are trivial, but some can have a dramatic impact on the entire initiative.
The following are among the most common errors companies make in implementing data quality programs:
• Searching for one answer to data quality–Improving data quality is complex and involves people, processes and products. No single piece of technology can solve all your data quality issues.
• Focusing on performance instead of quality–Many companies live with a certain level of data problems because the outcomes of the processes in which the data is used are acceptable. For example, if the returns from a direct marketing campaign are good, an organization may overlook the poor quality of its lists.
• Assuming source data is accurate–Data flows into most organizations from several sources. The safe assumption is that data from virtually every source contains some inconsistencies.
• Being reactive instead of proactive–Too often, companies wait until a data quality issue actually impedes a critical business process before they begin to address the problem.
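To make the last two points concrete, here is a minimal, hypothetical sketch of what a proactive check on incoming source data might look like. The field names, rules, and functions (`validate_record`, `partition_records`) are illustrative assumptions, not a prescribed implementation; the point is simply that records get validated on the way in rather than after a campaign has already used them.

```python
# A minimal, hypothetical sketch of a proactive data-quality gate:
# rather than assuming source data is clean, validate each incoming
# record before it reaches a downstream process (e.g., a mailing list).
# Field names and rules are illustrative assumptions only.

import re

def validate_record(record):
    """Return a list of problems found in a single customer record.
    An empty list means the record passed every check."""
    problems = []
    # Required fields must be present and non-empty.
    for field in ("name", "email", "zip"):
        if not record.get(field, "").strip():
            problems.append(f"missing {field}")
    # A deliberately loose email shape check (illustrative, not RFC-complete).
    email = record.get("email", "")
    if email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        problems.append("malformed email")
    # US ZIP codes: five digits, optionally followed by -XXXX.
    zip_code = record.get("zip", "")
    if zip_code and not re.match(r"^\d{5}(-\d{4})?$", zip_code):
        problems.append("malformed zip")
    return problems

def partition_records(records):
    """Split incoming records into clean and quarantined sets,
    pairing each record with the issues found in it."""
    clean, quarantined = [], []
    for record in records:
        issues = validate_record(record)
        (clean if not issues else quarantined).append((record, issues))
    return clean, quarantined
```

Quarantining bad records at the point of entry, rather than discovering them after a process fails, is one simple way to shift from reactive to proactive data quality.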
But in the data quality arena, the biggest sin of all is ignoring the issue and doing nothing at all. Surprisingly few companies have staff responsible for addressing data quality issues on an ongoing basis, and that is a major problem.