By Elliot King
Rumsfeld had the heavy burden of safeguarding the security of the United States
on his shoulders. And while not nearly as weighty, data quality specialists have
the responsibility of safeguarding the integrity of their organizations’ data.
For them, the easy set of questions is the known unknowns.
Since data quality is generally assessed by five or six characteristics, data
quality specialists should constantly be asking themselves questions that relate
to those characteristics. For example, is the data your company uses correct and
complete? Do you have enough data to achieve your business objectives? Is your
data being updated at appropriate intervals (or asked another way, what is the
rate of data quality decay)? And so on.
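Questions like these can be turned into automated checks. As a minimal sketch (not any particular tool's API), assume records are plain Python dicts with a hypothetical `email` field and an `updated` timestamp; completeness is then the share of non-null values, and a staleness rate serves as a rough proxy for decay:

```python
from datetime import datetime, timedelta

# Hypothetical records; field names and thresholds are illustrative assumptions.
records = [
    {"email": "a@example.com", "updated": datetime(2024, 1, 10)},
    {"email": None,            "updated": datetime(2023, 6, 1)},
    {"email": "c@example.com", "updated": datetime(2024, 3, 5)},
]

def completeness(rows, field):
    """Fraction of rows with a non-null value for the given field."""
    return sum(r[field] is not None for r in rows) / len(rows)

def stale_fraction(rows, as_of, max_age):
    """Fraction of rows not updated within max_age of as_of (a decay proxy)."""
    return sum(as_of - r["updated"] > max_age for r in rows) / len(rows)

as_of = datetime(2024, 4, 1)
print(round(completeness(records, "email"), 2))                       # → 0.67
print(round(stale_fraction(records, as_of, timedelta(days=180)), 2))  # → 0.33
```

Run on a schedule against real tables, scores like these turn the known unknowns into metrics you can trend and alert on.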
Unfortunately, depending on how the data quality program within an organization
is structured, many of those responsible for data quality live in a world of
unknown unknowns. For example, what is the source of data decay? If you import
data from external sources, what is the quality control mechanism in place at
the source? What new initiatives does your organization have in mind that might
affect overall data quality? What new business processes might be planned that
would have a similar impact? Is your data quality program
flexible enough to accommodate sudden changes?
Finding the answers to the known unknowns of data quality should be an integral
part of any data quality program. Trying to recognize what you don’t even know
that you don’t know requires imagination and perseverance. And trying to
understand that broader horizon can keep people up at night.