Survey reiterates disconnect between data trust and data quality
PEARL RIVER, N.Y. – Syncsort recently conducted a survey to explore data quality and organizations’ confidence in the data across their enterprises.
Though most respondents rated their organization’s data quality as good (38%) or very good (27%), the software company’s survey revealed a disconnect between how well organizations understand their data and how much they trust it to inform business decisions.
Syncsort reported on Wednesday that 69% of respondents said their leadership trusts data insights enough to inform business decisions, yet respondents also said only 14% of stakeholders have a very good understanding of the data.
Of the 27% who reported sub-optimal data quality, 72% said it negatively impacted business decisions.
The survey determined the top three challenges companies face when ensuring high-quality data are multiple sources of data (70%), applying data governance processes (50%) and volume of data (48%).
Survey findings noted that more than three-quarters of respondents (78%) have challenges profiling or applying data quality to large data sets.
About 29% of participants said they have a partial understanding of the data that exists across their organization, while 48% said they have a good understanding.
Syncsort found that fewer than 50% of respondents take advantage of a data profiling tool or data catalog.
Instead, respondents said they rely on other methods to gain an understanding of their data, with more than 50% using SQL queries and more than 40% using a BI tool.
Of those who reported partial, minimal or very little understanding of their data, the top three attributes respondents lacked visibility into were:
— Relationship between data sets (63%)
— Completeness of data (56%)
— Validation of data against defined rules (56%).
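For readers unfamiliar with the last two checks, here is a minimal sketch, assuming Python with pandas and wholly hypothetical column names, of what measuring completeness and validating data against a defined rule can look like in practice:

```python
import pandas as pd

# Hypothetical sample of vehicle-sale records; column names are illustrative only.
df = pd.DataFrame({
    "vin": ["1HGCM82633A004352", None, "2FTRX18W1XCA01234"],
    "sale_price": [18500.0, 21000.0, -300.0],
    "sale_date": ["2019-05-01", "2019-05-03", None],
})

# Completeness: fraction of non-null values in each column.
completeness = df.notna().mean()
print(completeness)

# Validation against a defined rule: sale prices must be positive.
violations = df[df["sale_price"] <= 0]
print(f"{len(violations)} row(s) violate the positive-price rule")
```

Dedicated data profiling tools automate checks like these across many tables at once, which is what the survey suggests most organizations are not yet doing.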
Of the survey participants who reported fair or poor data quality, wasted time was the No. 1 consequence (92%), followed by ineffective business decisions (72%) and customer dissatisfaction (67%).
Syncsort also noted that 25% of respondents who reported sub-optimal data quality say it has prevented their organization from adopting emerging technologies and methods, such as artificial intelligence, machine learning and blockchain.
Only 16% of respondents are confident they aren’t feeding bad data into artificial intelligence and machine learning applications.
The survey also revealed 73% are using cloud computing for strategic workloads, but 48% of them have partial to no understanding of the data that exists in the cloud. A total of 22% rate the quality of their data in the cloud as fair or poor.
“This survey confirms what we’ve been seeing with our customers — that good data simply isn’t good enough anymore,” Syncsort chief technology officer Tendü Yoğurtçu said. “Sub-optimal data quality is a major barrier, especially to the successful, profitable use of artificial intelligence and machine learning.
“The classic phrase ‘garbage-in, garbage-out’ has long been used to describe the importance of data quality, but it has a multiplier effect with machine learning — first in the historical data used to train the predictive model, and second in the new data used by that model to make future decisions,” Yoğurtçu continued.