Glossary
Data quality
Data quality is a measure of the condition of data, based on factors such as accuracy, completeness, timeliness, consistency and reliability.
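To make these dimensions more concrete, here is a minimal, purely illustrative sketch (in Python) of how completeness and validity might be measured for a handful of records. The record structure and the validation rule are assumptions made for the example, not a prescribed schema:

```python
from datetime import date

# Purely illustrative records; field names and rules are assumptions.
records = [
    {"email": "ana@example.org", "signup_date": date(2024, 5, 2)},
    {"email": None,              "signup_date": date(2023, 1, 15)},
    {"email": "bob@example",     "signup_date": None},
]

def completeness(records, field):
    """Share of records where the field is present (completeness)."""
    return sum(r[field] is not None for r in records) / len(records)

def validity(records, field, is_valid):
    """Share of non-missing values that pass a validity rule (a simple accuracy proxy)."""
    values = [r[field] for r in records if r[field] is not None]
    return sum(is_valid(v) for v in values) / len(values) if values else 0.0

print(f"email completeness:       {completeness(records, 'email'):.0%}")
print(f"signup_date completeness: {completeness(records, 'signup_date'):.0%}")
print(f"email validity:           {validity(records, 'email', lambda v: '@' in v and '.' in v.split('@')[-1]):.0%}")
```

In practice each dimension (timeliness, consistency, reliability and so on) would get its own rule, but the principle is the same: measure the condition of the data against explicit expectations.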
The data produced by organizations can create value for all. But this data must be reliable if people are to trust and use it. So, how do you ensure data quality? And why is it important?
Data quality: what are the issues?
Maintaining a high level of data quality is absolutely imperative for organizations.
According to a McKinsey study, employees spend an average of 30% of their time on tasks that add no value to the organization, largely because the data they work with is poor quality and difficult to access. High-quality data therefore increases efficiency.
Beyond saving time, data quality is critical to creating value. Unreliable or outdated information increases the risk of poorly informed decision-making and holds back productivity and data sharing.
Moreover, sharing inaccurate data externally (with customers, partners, citizens and competitors) damages brand reputation and can lead to legal issues.
How do you improve the quality of data?
In the age of big data, the challenge for organizations is to improve and guarantee the quality of the data they collect. The following key actions help improve data quality:
Data governance
Governance involves creating a data use and management policy whose purpose is to:
- Comply with legal obligations
- Validate the data
- Ensure consistency between different datasets
- Protect confidential information
- Ensure data quality
Data governance is increasingly important because organizations now have huge volumes of data, available in a wide variety of formats and stored in multiple locations. This makes it difficult to efficiently manage all the information available and ensure it can be shared across the organization.
Employees in charge of governance must therefore define frameworks, processes, and methodologies to make data more accessible, more reliable, more relevant and higher quality.
Data processing
Along with data governance, it is essential to continually strive for quality and excellence within data management.
For this reason, effective data management is based on the following actions:
- Appoint data stewards: they are responsible for managing the quality of the data.
- Define a framework: define the rules for data collection, analysis, storage, use, classification, protection, access, etc.
- Define quality criteria: for example, data sources, how frequently they are updated, their retention period, etc.
- Establish a methodology: cover all the steps in the process, from preparation to end use, including data integration. Defining this methodology should help you anticipate potential problems.
- Adapt: to guarantee high-quality data at all times, it is essential to perform regular analyses. If issues arise, investigate the datasets concerned and make changes to improve their quality (see the sketch after this list).
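As a rough illustration of the last three actions, the sketch below expresses quality criteria as a simple configuration and runs a periodic freshness check against it. The dataset names, update frequencies and retention periods are invented for the example:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical quality criteria for two datasets: source, expected update
# frequency and retention period (names and values are illustrative).
QUALITY_CRITERIA = {
    "air_quality_sensors": {"source": "city IoT feed", "max_age": timedelta(hours=1), "retention": timedelta(days=365)},
    "public_facilities":   {"source": "open data portal", "max_age": timedelta(days=30), "retention": timedelta(days=5 * 365)},
}

# Last-update timestamps as they might be reported by a data catalogue.
LAST_UPDATED = {
    "air_quality_sensors": datetime.now(timezone.utc) - timedelta(minutes=20),
    "public_facilities":   datetime.now(timezone.utc) - timedelta(days=45),
}

def freshness_report(now=None):
    """Flag datasets whose last update is older than the agreed frequency."""
    now = now or datetime.now(timezone.utc)
    report = {}
    for name, rules in QUALITY_CRITERIA.items():
        age = now - LAST_UPDATED[name]
        report[name] = "OK" if age <= rules["max_age"] else f"STALE (last updated {age.days} days ago)"
    return report

for dataset, status in freshness_report().items():
    print(f"{dataset}: {status}")
```

Running a check like this on a schedule turns the "adapt" step into a routine: stale or failing datasets are flagged automatically and can then be investigated.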
Management tools
To efficiently utilize the large amounts of data available today, having the right tools is essential. These tools can automate data processing, thereby limiting the time spent and reducing the risk of error.
However, it is essential to use high-performance solutions that focus more on the quality than the quantity of the data being processed.
The Opendatasoft platform, for example, offers more than 50 processors to automate data quality: you can replace text, create a geographical point, pseudonymize data (for greater confidentiality), normalize a date, and more. And you can do all this without writing a single line of code.
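As a hypothetical illustration of what processors like these do behind the scenes (this is not Opendatasoft's code or API), pseudonymization can be sketched as a keyed hash and date normalization as parsing into ISO 8601:

```python
import hashlib
import hmac
from datetime import datetime

# Illustrative secret key; in practice this would be managed securely.
SECRET_KEY = b"replace-with-a-real-secret"

def pseudonymize(value: str) -> str:
    """Replace an identifier with a stable, non-reversible pseudonym (keyed hash)."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def normalize_date(raw: str) -> str:
    """Try a few common date formats and return an ISO 8601 date string."""
    for fmt in ("%d/%m/%Y", "%Y-%m-%d", "%d %B %Y"):
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")

print(pseudonymize("jane.doe@example.org"))  # stable pseudonym for the same input
print(normalize_date("03/07/2024"))          # '2024-07-03'
print(normalize_date("3 July 2024"))         # '2024-07-03'
```

The point of a no-code platform is precisely that users do not have to write or maintain logic like this themselves; it is configured through the interface instead.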
You can also access the ODS Data Hub to cross-reference your data with over 28,000 public datasets maintained by our teams to enrich your databases.
Data culture
While data quality is supported by the use of technology, it is vital to understand that data is primarily used by human beings.
It is therefore crucial to educate teams on the importance of data, particularly in terms of how it is collected, processed, analyzed and protected.
This data culture has to involve and educate everyone, not just data scientists, data analysts and other data experts. This is vital as operational teams (such as sales, product, marketing, and operations) and senior management also need to use the data. Therefore, every department should be part of improving data quality.
Metadata
Metadata is generally defined as data which describes other data. The idea is to create a context for your data by answering the basic questions: who, what, where, when, how and why?
This allows you to simplify access to your data by first offering a summary of its content that can be accessed by both humans and technology tools.
High-quality data requires high-quality metadata to describe it, following specific rules and formats to ensure consistency and discoverability.
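As an illustration, a minimal metadata record answering those questions for an invented dataset might look like the following; the field names are loosely inspired by common catalogue schemas rather than any specific standard:

```python
import json

# Hypothetical metadata record: each field answers one of the
# who / what / where / when / how / why questions.
metadata = {
    "title": "Air quality measurements",                        # what
    "description": "Hourly readings from city sensors",         # what
    "publisher": "City of Exampleville",                        # who
    "contact": "opendata@exampleville.example",                 # who
    "spatial_coverage": "Exampleville metropolitan area",       # where
    "temporal_coverage": "2023-01-01/2024-12-31",               # when
    "update_frequency": "hourly",                               # when
    "collection_method": "calibrated IoT sensors",              # how
    "purpose": "monitor compliance with air quality targets",   # why
    "license": "Open Data License",
}

print(json.dumps(metadata, indent=2))
```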