Summary

Improving data quality is a constant battle. The challenges range from defining what good quality means to actually correcting the data. Disputes arise when creating standards, followed by technical difficulties when automating conformance verification and rectification.

Error prevention is certainly the ideal scenario, but it cannot always be achieved. Companies are constantly dealing with existing data, which suddenly becomes subject to new rules, new structures, and new referential integrity constraints. Furthermore, there is the typical lag between requirements and implementation, leading to a transitional period of adaptation to new business processes and system changes. All these disruptions can cause further corruption of the data, on top of the necessary correction of previous problems and preparation for new practices.
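As a minimal sketch of the kind of check this implies, the snippet below flags existing records that violate a newly introduced referential integrity constraint. The table and field names (`customers`, `orders`, `customer_id`) are hypothetical, chosen only for illustration:

```python
# Minimal sketch: verifying a newly introduced referential integrity
# constraint against pre-existing data. Table and field names are
# hypothetical examples, not from any specific system.

def find_orphans(child_rows, child_key, parent_keys):
    """Return child rows whose foreign key has no match in the parent set."""
    parent_set = set(parent_keys)
    return [row for row in child_rows if row[child_key] not in parent_set]

# Existing data created before the constraint was in place:
customers = [{"customer_id": 1}, {"customer_id": 2}]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 3},  # violates the new constraint
]

orphans = find_orphans(orders, "customer_id",
                       (c["customer_id"] for c in customers))
print(orphans)  # the order referencing the missing customer 3
```

Checks like this can surface legacy violations in bulk, so they can be queued for correction rather than blocking the rollout of the new rule.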

Adding to the mix is the incorporation and assimilation of data quality tools. An effective data quality management program requires powerful and flexible technology. Still, as much as data quality tools have improved lately, a large amount of human intervention is still necessary in some areas.

The good news is there is hope for this perceived chaos. As companies mature from a data quality perspective, they can better handle the constant maintenance struggles. Maturity across people, process, and technology will certainly minimize the bumps in the road and create an environment supported by a metrics ...
