UPDATED 09:00 EST / JANUARY 26 2016


Study finds data quality problems plague N. American companies

A new report paints a pretty dismal picture of the state of data quality in North American enterprises, even as data increasingly becomes a critical strategic asset.

The State of Enterprise Data Quality: 2016 report, which was prepared by 451 Research LLC and commissioned by Blazent Inc., found that just 40 percent of C-level executives and data scientists are “very confident” in the quality of the data used by their organizations, and that this is having a direct business impact. In fact, 65 percent said that up to half of business value can be lost because of poor data quality, and nearly one-third said the value decreases by more than 50 percent.

Big data quality problems can result from a wide variety of conditions, ranging from data entry errors to incompatibilities between source databases. Basic snafus such as missing or contradictory information can frustrate salespeople trying to identify new sales opportunities or close deals. In the age of big data analytics, however, the consequences can be more dire: errors magnified by analytical calculations can lead managers to make poor decisions.
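To make those basic snafus concrete, here is a minimal sketch, not taken from the report, of the kind of record-level check that catches missing or contradictory information before it reaches a sales team. The field names and rules are hypothetical.

```python
# Hypothetical CRM-style records; field names and values are illustrative only.
records = [
    {"account": "Acme Corp", "region": "East", "annual_revenue": 1_200_000, "status": "active"},
    {"account": "Globex",    "region": None,   "annual_revenue": 850_000,   "status": "active"},
    {"account": "Initech",   "region": "West", "annual_revenue": -50_000,   "status": "closed"},
]

def find_quality_issues(record):
    """Return a list of data quality problems found in a single record."""
    issues = []
    # Missing information: required fields that are empty or absent.
    for field in ("account", "region", "annual_revenue"):
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    # Contradictory information: values that cannot both be true (hypothetical rules).
    revenue = record.get("annual_revenue")
    if isinstance(revenue, (int, float)) and revenue < 0:
        issues.append("negative annual_revenue")
    if record.get("status") == "closed" and revenue:
        issues.append("closed account still carries revenue")
    return issues

for rec in records:
    problems = find_quality_issues(rec)
    if problems:
        print(rec["account"], "->", ", ".join(problems))
```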

There was also concern about the completeness of data used in analytics modeling, with 57.5 percent of respondents expressing doubts about whether the data they use had been aggregated before it was cleansed. Analytics often involves aggregating data from multiple sources.
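A hedged illustration, with made-up numbers rather than figures from the report, shows why aggregating before cleansing matters: a single bad record changes the rolled-up figure that managers actually see.

```python
# Made-up deal sizes; the last entry represents a data-entry error.
deal_sizes = [12_000, 15_500, 14_200, 13_800, 1_400_000]

# Aggregating first carries the error straight into the metric.
aggregated_first = sum(deal_sizes) / len(deal_sizes)

# Cleansing first (here, dropping an obvious outlier) gives a very different answer.
cleansed = [d for d in deal_sizes if d < 100_000]
cleansed_first = sum(cleansed) / len(cleansed)

print(f"average deal size, aggregated before cleansing: ${aggregated_first:,.0f}")
print(f"average deal size, cleansed before aggregating: ${cleansed_first:,.0f}")
```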

Time is running out to fix the problem. Ninety-five percent of the 200 executive-level respondents to the survey said they expect data sources and volumes to triple over the next 12 months. And there is some evidence that many organizations are whistling past the graveyard: 82 percent said their organization thinks the quality of its data is better than it really is.

Rearview mirror

The most common way organizations discover data quality errors is after the fact; 44.5 percent of respondents said they find errors by scanning reports after the data is captured. More than one-third use manual techniques to cleanse data, a process that will become increasingly impractical as volumes grow.

And who gets stuck holding the bag? The IT organization, which is responsible for keeping data clean at nearly 80 percent of the companies surveyed, despite having little control over how it is captured and qualified at the point of origin.

“IT departments have become burdened with the task of employing multiple technologies to compensate for the fact that the responsibility for data quality is generally not assigned to those directly involved with its capture,” wrote 451 Research analysts Carl Lehman, Krishna Roy and Bob Winter.

Executives see a lot of promise in machine learning, which should be music to Blazent’s ears. Fifty-six percent want to apply machine learning technology to data quality management within the next two years.

The report’s authors recommend that organizations create and enforce policies for all means of data capture and entry across the organization, holding the systems and people that capture data responsible for its quality. They also recommend that data quality management tools, techniques and services “need to be rationalized and standardized to enable a combination of data cleansing and integration.”

Image by Geralt via Pixabay
