By Elliot King
cases they don’t exist. In IT generally, people often hope to find some
technology that can serve as a quick fix or a silver bullet. Generating too
much data? Buy additional storage. Problem solved.
To fix data quality problems, however, great technical tools, while necessary,
are never enough. Data quality rests on three pillars: people, processes and
technology. If a data quality improvement program does not cover all three, its
chances of success are greatly diminished.
Over the past several years, it has become clear that data quality is not just
an IT issue but a business issue as well. Data is a corporate asset, and its
value comes from business contexts. Consequently, data quality must be a
responsibility shared between IT and the business units. IT can provide the
technical knowledge, but the context has to come from business analysts.
Moreover, as with most corporate activities, somebody has to be in charge or
nothing will get done.
Unfortunately, there is no “silver bullet” for data quality. Your data
problems cannot all be solved in one quick fix. And even if your data is
perfect today, it won’t be tomorrow. Stuff happens, and it is reflected in
your data.
Companies must have policies in place to safeguard data quality. For example,
how often should data, and data quality procedures themselves, be audited?
What are the procedures for integrating data from external sources? And so on.
But people and policies are not enough. Most companies simply have too much
data to profile, correct and control manually. The job is just too big.
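To make the profiling step concrete, here is a minimal sketch of what an automated data quality check can look like. The record fields, validation rules and sample values are hypothetical, invented for illustration; real profiling tools apply far richer rules at far larger scale.

```python
# Minimal sketch of automated data profiling: count, per field, how many
# records violate a validation rule. All names and rules are illustrative.
import re

def profile(records, rules):
    """Return a per-field count of records that fail that field's rule."""
    violations = {field: 0 for field in rules}
    for rec in records:
        for field, is_valid in rules.items():
            if not is_valid(rec.get(field)):
                violations[field] += 1
    return violations

# Hypothetical customer records with typical quality defects.
customers = [
    {"email": "ann@example.com", "zip": "21201"},
    {"email": "bob(at)example.com", "zip": "21202"},  # malformed email
    {"email": "cara@example.com", "zip": ""},         # missing zip code
]

# Simple validity rules: a plausible email shape, a five-digit US zip.
rules = {
    "email": lambda v: bool(v)
        and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "zip": lambda v: bool(v) and v.isdigit() and len(v) == 5,
}

report = profile(customers, rules)
# report -> {"email": 1, "zip": 1}
```

Run over millions of rows on a schedule, a report like this tells the business which fields are degrading and whether the audit policies above are working.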
So for data quality there is no quick fix. But why should there be?