
Poor Data Quality: A $600 Billion Issue


An independent software research organization has issued a report concluding that poor data quality is costing U.S. businesses more than $600 billion annually, yet most executives are unaware of its dangers.

According to the Seattle-based Data Warehousing Institute, poor-quality customer data costs U.S. businesses $611 billion a year in postage, printing and staff overhead. “Frighteningly,” the report notes, “the real cost of poor quality data is much higher. Organizations can frustrate and alienate loyal customers by incorrectly addressing letters or failing to recognize [customers] when they call or visit a store or Web site.”

Despite these dangers, however, “most executives are oblivious to the data quality lacerations that are slowly bleeding their companies to death,” states the report, titled Data Quality and the Bottom Line. The report also found that nearly 50% of survey respondents had no plans to implement measures to improve data quality.

The report's findings were based on survey results from 647 respondents, primarily U.S.-based information technology managers and staff, across a broad range of industries. About 11% of respondents were in financial services companies, while another 9.5% were in insurance.

According to Wayne Eckerson, director of education and research for the Data Warehousing Institute, data quality at its most basic means removing the errors found in data. Such errors are often the result of data entry mistakes, such as transposed letters or numbers. Data also often become wrong due to changes over time, such as when a person dies, he notes.

“It's also how you interpret the data,” Eckerson points out. Within the same organization, different divisions may have different definitions for the same data elements or terms. Classic examples, he says, are what constitutes a “sale,” or how “gross profits” are calculated.

“The problem with data is that its quality quickly degenerates over time,” the report states. “Experts say 2% of records in a customer file become obsolete in one month because customers die, divorce, marry or move. In addition, data entry errors, systems migrations and changes to source systems generate bucket loads of errors.”

“Truthfully, though, most data quality problems come up with the name and address data fields,” says Eckerson. He characterizes the cost of mislabeled, misprinted communications as “astounding.”

The report cites a “real-life” insurance example in which a carrier receives two million claims per month with 377 data elements per claim. “Even at an error rate of .1%, the claims data contains more than 754,000 errors per month and more than 9.04 million errors per year,” the report notes. “If the insurance company determines that 10% of the data elements are critical to its business, [it] must still fix almost one million errors each year.”

Further, the report notes, if the insurer estimates $10 per error in staff time needed to make corrections for erroneous payouts, etc. (a “conservative” estimate), “the company's risk exposure to poor quality claims data is $10 million a year.”

The report adds that the estimated risk in the example does not include the firm's exposure to poor data quality in financial, sales, human resources, decision support and other operations.
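The arithmetic behind the example is easy to retrace. The short Python sketch below is purely illustrative and is not part of the report; it simply multiplies out the figures quoted above and shows how they lead to an exposure on the order of the $10 million the report cites.

```python
# Back-of-the-envelope check of the report's claims example.
# The figures come from the article; the script itself is only illustrative.
claims_per_month = 2_000_000
elements_per_claim = 377
error_rate = 0.001            # .1%, i.e. one error per thousand data elements
critical_share = 0.10         # share of data elements deemed business-critical
cost_per_error = 10           # "conservative" staff-time cost per correction, in dollars

elements_per_month = claims_per_month * elements_per_claim    # 754,000,000
errors_per_month = elements_per_month * error_rate            # 754,000
errors_per_year = errors_per_month * 12                       # ~9.05 million
critical_errors_per_year = errors_per_year * critical_share   # ~905,000 ("almost one million")
annual_exposure = critical_errors_per_year * cost_per_error   # ~$9 million; the report rounds
                                                              # this to roughly $10 million a year

print(f"{errors_per_month:,.0f} errors per month, {errors_per_year:,.0f} per year")
print(f"{critical_errors_per_year:,.0f} critical errors per year -> ${annual_exposure:,.0f} exposure")
```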

What can a business do to reduce its risk? While there are some technology tools that will help, Eckerson says the key is to be committed to keeping data clean on a continuous basis. “A lot of companies put a Band-Aid on and they get by,” but it becomes a problem when it affects a high-profile project or operation, he notes.

“Data is as critical a resource as a company's assets and money,” he states. For that reason, “companies need to standardize processes for managing data.”

The report recommends that firms “treat data as a strategic corporate resource, develop a program for managing data quality with a commitment from the top, and hire, train or outsource experienced data quality professionals to oversee and carry out the program.”

Commercial data quality tools and service bureaus that automate the process of auditing, cleaning and monitoring data may be worth the investment, the report adds. “Most commercial tools are now moving beyond auditing and scrubbing name and address data to tackle other data types. They are also beginning to step up to the challenge of validating company-specific business rules.”
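As a rough illustration of the kind of auditing and rule validation such tools automate, consider the minimal sketch below. The sample records, field names and the business rule are all hypothetical, standing in for whatever a company's own systems and rules would supply.

```python
import re

# Hypothetical customer records; the field names are illustrative only.
customers = [
    {"name": "Jane Doe", "zip": "98101", "state": "WA", "policy_premium": 1200},
    {"name": "", "zip": "9810", "state": "WA", "policy_premium": -50},
]

ZIP_PATTERN = re.compile(r"^\d{5}(-\d{4})?$")

def audit(record):
    """Return a list of data quality problems found in one customer record."""
    problems = []
    if not record["name"].strip():
        problems.append("missing name")
    if not ZIP_PATTERN.match(record["zip"]):
        problems.append("malformed ZIP code")
    # Example of a company-specific business rule: premiums must be positive.
    if record["policy_premium"] <= 0:
        problems.append("non-positive premium")
    return problems

for record in customers:
    issues = audit(record)
    if issues:
        print(f"{record['name'] or '<no name>'}: {', '.join(issues)}")
```

In practice, the point of such a check is not the individual rules but running them continuously against source systems, which is the "total quality management for data" approach Eckerson describes.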

The bottom line, says Eckerson, is to embrace “total quality management for data,” which means stopping data errors at their source.

The Data Warehousing Institute is an independent association involved in education, research and training for data warehousing and business intelligence professionals.


Reproduced from National Underwriter Life & Health/Financial Services Edition, March 18, 2002. Copyright 2002 by The National Underwriter Company in the serial publication. All rights reserved. Copyright in this article as an independent work may be held by the author.



