By David Loshin

Last week I shared some thoughts about a current client and the challenge of justifying the development of a data quality program in an organization that has, over time, evolved into one with distributed oversight and, consequently, very loose enterprise-wide controls.

The trick to justifying this effort, it seems, is to address some of the key issues impacting the usability of data warehouse data: many downstream business users complain about data usability, long waits to get data into their applications, and difficulty getting answers to the questions they ask.

The issues mostly center on data inconsistency, delays in populating the analytical platform, incomplete data, the rampant potential for duplicated key entity data across the numerous data feeds, long investigations into why data is invalid, and a general inability to satisfy specific downstream customer needs.

Because all of these issues are associated with the context of day-to-day reporting and analysis, we have inferred that the common theme is operational efficiency, and that is the dimension of value that we have chosen as the key for establishing the value proposition. Therefore, our business justification focuses on the data quality management themes that would resonate in terms of improving operational efficiency:

  • Improve proactive detection of invalid data prior to loading into the data warehouse
  • Speed the effort of finding and analyzing the sources of data errors
  • Make the time to remediate data issues more predictable
  • Better communicate data errors to the data suppliers
  • Provide business-oriented data quality scorecards to the data users
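To make the first and last of these themes concrete, here is a minimal sketch of rule-based validation applied before loading, rolled up into a scorecard-style summary. The rule names, field names, and thresholds are hypothetical, not taken from the client's environment:

```python
# Illustrative pre-load validation rules; all rule and field names are assumptions.

def validate_record(record):
    """Apply simple data quality rules; return the names of any rules that failed."""
    failures = []
    if not record.get("customer_id"):
        failures.append("completeness:customer_id")
    if record.get("amount") is not None and record["amount"] < 0:
        failures.append("validity:amount_non_negative")
    return failures

def scorecard(records):
    """Summarize rule failures across a batch, scorecard-style."""
    counts = {}
    valid = 0
    for rec in records:
        failures = validate_record(rec)
        if not failures:
            valid += 1
        for rule in failures:
            counts[rule] = counts.get(rule, 0) + 1
    total = len(records)
    return {
        "total": total,
        "valid": valid,
        "percent_valid": round(100.0 * valid / total, 1) if total else 0.0,
        "failures_by_rule": counts,
    }

batch = [
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "",   "amount": 5.0},
    {"customer_id": "C2", "amount": -3.0},
]
print(scorecard(batch))
# {'total': 3, 'valid': 1, 'percent_valid': 33.3,
#  'failures_by_rule': {'completeness:customer_id': 1, 'validity:amount_non_negative': 1}}
```

Running checks like these before the warehouse load catches invalid records proactively, and the per-rule failure counts are exactly the kind of business-oriented statistic a data quality scorecard would report to data users and suppliers.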

The goal is to provide reporting with credible statistics that demonstrate the efficiency of validating data, finding errors and fixing them, and delivering measurably high-quality data.




