Data quality does not happen by chance. It requires an ongoing plan that addresses the current weak points in your data, delivers the desired results, and can be accomplished with the resources you have available.

A robust data quality plan should also be forward-thinking: it should anticipate future issues, be flexible enough to grow with your organization, and scale with your business needs.

Here are seven steps that will help you build a workflow for scalable data quality:

1. Acknowledge there is a problem.

Before anything can happen, your company must recognize that poor data quality is a problem and that it needs a solution. This requires a reasonably rigorous audit of the costs incurred by inaccurate data.

The audit should include expenses directly related to bad address data (such as USPS or parcel carrier fees), lost revenue, and employee hours spent correcting problems that a data quality initiative could prevent.
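
To make that tally concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure is a hypothetical placeholder, not real data; substitute the numbers your own audit turns up.

```python
# Back-of-the-envelope audit of the annual cost of bad address data.
# All figures below are hypothetical placeholders -- use your own audit numbers.

undeliverable_pieces = 12_000        # mail pieces returned per year (assumed)
cost_per_returned_piece = 0.75       # postage + materials per piece, USD (assumed)
carrier_correction_fees = 18_000.00  # parcel carrier address-correction fees, USD (assumed)
correction_hours = 500               # employee hours spent fixing records (assumed)
loaded_hourly_rate = 40.00           # fully loaded labor cost, USD/hour (assumed)
lost_revenue = 25_000.00             # revenue lost to failed deliveries, USD (assumed)

annual_cost = (
    undeliverable_pieces * cost_per_returned_piece
    + carrier_correction_fees
    + correction_hours * loaded_hourly_rate
    + lost_revenue
)

print(f"Estimated annual cost of bad address data: ${annual_cost:,.2f}")
```

Even a rough total like this gives management a dollar figure to weigh against the cost of a data quality initiative.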

2. Assign a point person.

Assign a person or a team to be responsible for executing your data quality initiative. This “point person” will be in charge of identifying the problem, identifying possible solutions, and bringing those recommendations back to management. After an approach is decided upon, the point person will also be in charge of putting the initiative into effect.

3. Identify the problem.

The first responsibility of the point person will be to identify the root cause of any data quality problems in your organization. This will require a detailed examination of how bad data enters the “pipeline” and an evaluation of the most effective point at which to enforce data quality.
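
As a minimal illustration of enforcement at the point of entry, the sketch below gates new address records before they reach the rest of the pipeline. The field names and validation rules are assumptions for illustration only, not a prescription.

```python
import re

# Minimal gate at the point of entry: reject bad records before they enter
# the pipeline instead of cleaning them downstream. Field names and rules
# here are illustrative assumptions only.

ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")  # U.S. ZIP or ZIP+4

def validate_address(record: dict) -> list:
    """Return a list of problems; an empty list means the record may proceed."""
    problems = []
    for field in ("street", "city", "state", "zip"):
        if not record.get(field, "").strip():
            problems.append(f"missing {field}")
    zip_code = record.get("zip", "").strip()
    if zip_code and not ZIP_RE.match(zip_code):
        problems.append("malformed ZIP code")
    return problems

record = {"street": "123 Main St", "city": "Springfield", "state": "IL", "zip": "62704"}
issues = validate_address(record)
if issues:
    print("Rejected at entry:", "; ".join(issues))
else:
    print("Accepted into the pipeline")
```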

4. Perform a technology assessment.

Deciding upon an approach to data quality requires a thorough and realistic appraisal of the technology and resources available within the organization. Ask yourself the following questions:

  • If data quality is to be enforced via a web site, is it feasible to integrate the technology into the web site or will the site need to be retooled to make it possible?
  • Is the volume of addresses to be verified small enough to make a web service the more economical solution? (See the sketch after this list.)
  • Does the organization have the programming resources to integrate a data quality tool into an existing application?
  • Does the software used by the organization allow customization or is it a closed system? If it’s a closed system, will a standalone address verification program fill the needs of the organization?
  • What would be the best way to ensure that all address data passes through a data quality process before it is used in any way? What changes to the network might be necessary to make this happen?
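
To make the web-service option concrete, here is a minimal sketch of what a call to an address verification service might look like. The endpoint URL, parameters, and response fields are hypothetical stand-ins, not any particular vendor's actual API.

```python
import requests

# Hypothetical address verification web service. The URL, parameters, and
# response shape below are illustrative assumptions, not a real vendor API.
VERIFY_URL = "https://api.example.com/v1/verify-address"

def verify_address(street: str, city: str, state: str, zip_code: str) -> dict:
    """Send one address to the (hypothetical) service and return its verdict."""
    response = requests.get(
        VERIFY_URL,
        params={"street": street, "city": city, "state": state, "zip": zip_code},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"deliverable": true, "corrected": {...}}

result = verify_address("123 Main St", "Springfield", "IL", "62704")
print("Deliverable:", result.get("deliverable"))
```

At low volumes, a per-call service like this can avoid the integration and maintenance costs that the other questions above are probing.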

5. Evaluate vendors.

Once the extent of the problem and the available resources have been identified, the point person identifies possible data quality solution vendors and evaluates their products. This evaluation should include downloading and using trial versions of their software (if available), as well as contacting other users of the same software or organizations similar to your own.

6. Implement the solution.

At this stage, the recommended data quality solution is put into practice, possibly on a limited basis, for a trial period. The point person tracks costs and staff hours so they can be compared with those of a similar period before the data quality initiative.
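
A minimal way to frame that before-and-after comparison is sketched below. The metrics and figures are hypothetical placeholders for whatever your point person actually tracks.

```python
# Compare a trial period against a baseline period of the same length.
# All figures are hypothetical placeholders for your own tracked numbers.
baseline = {"carrier_fees_usd": 4_500.00, "correction_hours": 120}
trial = {"carrier_fees_usd": 1_800.00, "correction_hours": 35}

for metric, before in baseline.items():
    after = trial[metric]
    change = (before - after) / before * 100
    print(f"{metric}: {before:g} -> {after:g} ({change:.0f}% reduction)")
```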

7. Validate the results.

After the trial period, the results are collected and reported back to management. At this point, the effectiveness of the data quality initiative can be evaluated and any necessary adjustments can be made before putting the solution into full-scale use throughout the organization.

To read the full report, go to:

http://www.melissadata.com/dqt/whitepaper/scalable-data-quality-whitepaper.pdf
