We concur with Evan Levy of Baseline Consulting Group's argument that data quality should be an “entrenched” process, since data is continually changing. Read his article “Perfect Data and other Data Quality Myths” below.

A recent client experience reminds me what I’ve always said about data quality: it isn’t the same as data perfection. After all, how could it be? A lot of people think that correcting data is a post-facto activity based on opinion and anecdotal problems. But it should be an entrenched process.

One drop of motor oil can pollute 25 quarts of drinking water, but data doesn't work that way. An average of fewer than 75 insect fragments per 50 grams of wheat flour, on the other hand, is considered acceptable.

People forget that the definition of data quality is data that's fit for purpose: it conforms to requirements. You only have to look back at the work of Philip Crosby and W. Edwards Deming to understand that quality is about conformance to requirements. We need to understand the variance between the data as it exists and the level that's acceptable, not chase perfection.
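To make the “conformance to requirements” idea concrete, here is a minimal Python sketch of a fitness-for-purpose check. The field names, validation rules, and 2% tolerance are illustrative assumptions, not taken from Levy's article: the point is simply that quality is measured as variance from a stated requirement and judged against an agreed tolerance, not against perfection.

```python
import re

# Illustrative sample data (assumed, not from the article).
customers = [
    {"id": 1, "email": "alice@example.com", "zip": "30301"},
    {"id": 2, "email": "bob(at)example.com", "zip": "30301"},
    {"id": 3, "email": "carol@example.com", "zip": ""},
]

def conforms(record):
    """Requirements for this particular purpose (say, a mailing):
    a parseable email address and a 5-digit ZIP code."""
    has_email = re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record["email"]) is not None
    has_zip = re.fullmatch(r"\d{5}", record["zip"]) is not None
    return has_email and has_zip

# Measure the variance from the requirement...
defect_rate = sum(not conforms(r) for r in customers) / len(customers)

# ...and compare it with what the business agrees is acceptable
# (2% here is a hypothetical tolerance, not a standard).
tolerance = 0.02

print(f"Defect rate: {defect_rate:.1%} (tolerance {tolerance:.0%})")
print("Fit for purpose" if defect_rate <= tolerance else "Remediation needed")
```

Run as part of a regular load or profiling job, a check like this turns data quality into the ongoing, “entrenched” process the article describes rather than a one-off cleanup.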

Read the full article here: http://www.melissadata.com/enews/articles/12102010/1.htm
