
The DQ Glossary

Accuracy: The correctness of data, ensuring it reflects the real-world entities it represents.

Authenticity: The genuineness and trustworthiness of data, indicating that it is original, unaltered, and from a reliable source.

Cleansing: Identifying, correcting, or removing errors, inconsistencies, and inaccuracies within a dataset to improve data quality.

Completeness: The extent to which data contains all necessary information for a particular record.

Consistency: The uniformity of data across various systems and sources.

Customer Data Integration (CDI): The process of integrating and managing customer data from multiple sources to create a unified and accurate view.

Customer Identity Resolution: The process of identifying and linking multiple records related to a single customer across different data sources.

Data Anomalies: Unusual or unexpected patterns, behaviours, or values within a dataset that deviate from the norm.

Data Cleansing: The process of correcting or removing inaccuracies and inconsistencies in data.

Data Deduplication: Identifying and eliminating duplicate records within a dataset.

Data Enrichment: Enhancing existing data with additional information from external sources.

Data Governance: The framework and processes for managing data quality, security, and compliance.

Data Governance Framework: A structured approach to managing, organising, and controlling data to ensure quality, security, and compliance.

Data Integration: Combining data from different sources to create a unified and comprehensive dataset.

Data Integration Tools: Software or tools designed to combine and unify data from different sources, facilitating seamless data flow and analysis.

Data Masking: The process of obscuring or anonymising sensitive information within a dataset to protect privacy.

Data Matching Software: Software designed to identify and match similar or identical records within a dataset, supporting tasks like deduplication.

Data Ownership: The assignment of responsibility for the accuracy, security, and overall quality of specific datasets.

Data Privacy: The protection of sensitive and personal information to ensure compliance with privacy regulations.

Data Profiling: The process of analysing data to understand its structure, quality, and relationships.

Data Profiling Software: Software that automates the process of analysing and understanding the characteristics, quality, and relationships within a dataset.

Data Profiling Tools: Software or tools designed to analyse and understand the characteristics, quality, and structure of data within a dataset.

Data Quality Dashboard: A visual representation or interface displaying key data quality metrics and insights for monitoring data health.

Data Quality Improvement: The ongoing process of enhancing data quality through measures like cleansing, validation, and standardisation.

Data Quality Metrics: Quantifiable measures used to assess the quality of data, focusing on aspects like accuracy, completeness, and consistency.

Data Quality Score: A quantitative measure indicating the overall quality of a dataset, often calculated based on various data quality metrics.

Data Quality Tools: Software or applications specifically designed to assess, improve, and maintain the quality of data within an organisation.

Data Scrubbing: Another term for data cleansing, involving the identification and correction of errors and inconsistencies.

Data Standardisation: The process of transforming data into a common format, ensuring consistency and uniformity.

Data Stewardship: The responsible management and oversight of data to ensure its quality, security, and compliance.

Data Validation: Checking data against predefined standards or rules to ensure compliance.

Master Data Management (MDM): A method of managing and organising core business entities to ensure consistency and accuracy across an organisation.

Mastering: The process of creating and maintaining a master or authoritative version of data, ensuring a unified and consistent view.

Matching: The process of comparing and identifying similarities or commonalities between two sets of data, used in tasks like deduplication and record linkage.

Merging: Combining two or more sets of data into a single dataset, typically from different sources, to create a comprehensive view.

Reliability: The trustworthiness of data, indicating its accuracy and consistency over time.

Single Customer View (SCV): The comprehensive and unified representation of a customer’s data from various touchpoints and interactions.

Validity: The degree to which data accurately represents the real-world concept it is intended to measure or describe. Valid data is relevant, meaningful, and correctly reflects the intended attributes or characteristics.

Verification: The process of checking whether data is accurate and truthful by cross-referencing it with a reliable source or using a standardised method.
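Several of the terms above (data validation, completeness, data quality score) describe measurable checks. As a minimal sketch in Python, using entirely hypothetical records, fields, and rules, a rule-based validation pass and a simple quality score might look like:

```python
import re

# Hypothetical customer records; "email" and "age" are example fields.
records = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "age": 36},
    {"name": "Alan Turing",  "email": "not-an-email",    "age": 41},
    {"name": "Grace Hopper", "email": "",                "age": None},
]

# Validation: check each record against predefined rules.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_valid(record):
    """A record passes if its email matches the pattern and age is present."""
    return bool(record["email"]
                and EMAIL_RE.match(record["email"])
                and record["age"] is not None)

# A simple data quality score: the share of records passing all rules.
valid = [r for r in records if is_valid(r)]
quality_score = len(valid) / len(records)
print(f"Data quality score: {quality_score:.0%}")  # → 33%
```

Real data quality tools apply many such rules per field and aggregate them into per-dimension metrics; this sketch collapses everything into a single pass rate.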
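Standardisation, matching, and deduplication likewise compose into a simple pipeline: normalise each record into a common format, derive a match key from the standardised fields, and keep one record per key. The sketch below uses hypothetical records and an exact-key match (production matching software typically adds fuzzy comparison as well):

```python
# Hypothetical records with near-duplicate entries; all fields are illustrative.
customers = [
    {"id": 1, "name": " Jane  Smith ", "city": "LONDON"},
    {"id": 2, "name": "jane smith",    "city": "London"},
    {"id": 3, "name": "John Doe",      "city": "Leeds"},
]

def standardise(record):
    """Standardisation: trim, collapse whitespace, and normalise case."""
    return {
        "name": " ".join(record["name"].split()).lower(),
        "city": record["city"].strip().lower(),
    }

def match_key(record):
    """Matching: records sharing the same standardised key are treated as duplicates."""
    std = standardise(record)
    return (std["name"], std["city"])

# Deduplication: keep the first record seen for each match key.
seen, deduped = set(), []
for rec in customers:
    key = match_key(rec)
    if key not in seen:
        seen.add(key)
        deduped.append(rec)

print([r["id"] for r in deduped])  # → [1, 3]
```

Which duplicate survives (here, the first seen) is a policy decision; mastering tools usually pick or merge a "best" record instead.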

Install DQ for Excel within minutes for complete control over your customer data.

