By David Loshin
But in many cases there is still an issue of coordination. While
standardizing the approaches to monitoring for data validity helps reduce the
effort and complexity of analyzing and remediating issues, there is still the
situation that the same error may impact multiple data consumers; if each data
consumer reports the issue to one of the data stewards, you have many stewards
investigating the same problem. So this is where our second suggestion comes in:
instituting methods for coordinating those evaluations.
This is an area in which the data management world can learn lessons from our
friends in desktop or network support, who rely on incident management systems
for the reporting, tracking, and management of issues. Data consumers impacted
by a data error can report the flaw in the incident management and tracking
system, which can assign a unique identifier to the logged issue and then route
it to a specific data steward.
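The reporting-and-routing entry point described above can be sketched in a few lines. This is a minimal illustration, not any real incident-management product's API; the `Incident` and `IncidentTracker` names, and the dataset-to-steward routing table, are all assumptions made for the example.

```python
# Hypothetical sketch of an incident-tracking entry point for data
# quality issues. All names here are illustrative assumptions.
import itertools
from dataclasses import dataclass

@dataclass
class Incident:
    issue_id: int
    dataset: str
    description: str
    steward: str
    status: str = "reported"

class IncidentTracker:
    def __init__(self, routing):
        # routing maps a dataset name to the steward responsible for it
        self._routing = routing
        self._ids = itertools.count(1)
        self.incidents = {}

    def report(self, dataset, description):
        # assign a unique identifier and route the logged issue
        # to a specific data steward
        issue_id = next(self._ids)
        steward = self._routing.get(dataset, "unassigned")
        self.incidents[issue_id] = Incident(issue_id, dataset, description, steward)
        return issue_id

tracker = IncidentTracker({"customer_master": "alice"})
ticket = tracker.report("customer_master", "null values in postal_code")
```

The key point is simply that every reported flaw gets a unique identifier and a named owner, so nothing reported falls through the cracks.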
Moreover, carefully structuring the ways that errors are described when they are
reported provides hierarchies and organization that facilitate assigning issues
to those stewards with the greatest corresponding expertise.
In other words, issues can be grouped to reduce the amount of replicated effort.
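The grouping idea can be shown with a small sketch: when several data consumers report the same underlying flaw, keying reports by a shared error description collapses them into one investigation. The report fields and values below are invented for illustration.

```python
# Illustrative sketch (assumed field names and data) of grouping
# reports so the same error affecting multiple data consumers is
# investigated once, not once per reporting consumer.
from collections import defaultdict

reports = [
    {"consumer": "billing",   "dataset": "customer_master", "error": "invalid postal_code"},
    {"consumer": "marketing", "dataset": "customer_master", "error": "invalid postal_code"},
    {"consumer": "finance",   "dataset": "orders",          "error": "negative quantity"},
]

# key each report by (dataset, error description);
# one group corresponds to one steward investigation
groups = defaultdict(list)
for r in reports:
    groups[(r["dataset"], r["error"])].append(r["consumer"])
```

Three consumer reports collapse into two distinct issues, and each group still records every affected consumer for later notification.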
In turn, an incident tracking system for data quality issues also provides entry
points for tracking the status of the issues – whether the root cause has been
identified, if a correction has been performed, or if further evaluation is
needed.
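Those status checkpoints can be modeled as a simple lifecycle. The statuses below mirror the stages just named (reported, needs evaluation, root cause identified, corrected); the allowed-transition table itself is an assumption for the sake of the sketch.

```python
# Hedged sketch of issue-status tracking; the transition table is an
# illustrative assumption, not a prescribed workflow.
ALLOWED = {
    "reported":              {"needs_evaluation", "root_cause_identified"},
    "needs_evaluation":      {"root_cause_identified"},
    "root_cause_identified": {"corrected"},
    "corrected":             set(),
}

def advance(status, new_status):
    # reject transitions the lifecycle table does not permit
    if new_status not in ALLOWED[status]:
        raise ValueError(f"cannot move from {status} to {new_status}")
    return new_status

s = "reported"
s = advance(s, "root_cause_identified")
s = advance(s, "corrected")
```

Making the lifecycle explicit is what lets the tracking system answer status questions at a glance instead of requiring a call to the investigating steward.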
In the next post: Using data quality rules as a proactive measure.