Understanding Data Quality


1 Understanding Data Quality

2 What is data quality?

3 Definition Quality data are accurate depictions of the real world that are consistent across an enterprise, secure and accessible, delivered in a timely manner, and suitable for their intended applications (Redman, 2001). The state of completeness, consistency, timeliness and accuracy that makes data appropriate for a specific use (Government of British Columbia). Data quality institutionalizes a set of repeatable processes to continuously monitor data and improve data accuracy, completeness, timeliness and relevance (Holly Hyland and Lisa Elliott).

4 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility National Center for Education Statistics (NCES), US.

5 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility The data represent the truth. The best up-front tool for data accuracy is a “single, exhaustive data dictionary.” The data dictionary must be published, understood, and used. It is the definitive source for data elements and includes data definitions, code lists, formats for each type of data, and restrictions on values or ranges.
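A data dictionary of this kind can be made machine-readable so the restrictions it records are actually enforced. The sketch below assumes two hypothetical elements (`student_id`, `grade_level`); real dictionaries would also record owners, code lists, and provenance.

```python
import re

# Hypothetical data-dictionary entries: definition, format, and value restrictions.
DATA_DICTIONARY = {
    "student_id": {
        "type": str,
        "format": r"^S\d{6}$",
        "description": "Unique student identifier, 'S' plus six digits",
    },
    "grade_level": {
        "type": int,
        "range": (1, 12),
        "description": "Current grade level",
    },
}

def validate(element, value):
    """Check a value against its data-dictionary entry."""
    spec = DATA_DICTIONARY[element]
    if not isinstance(value, spec["type"]):
        return False
    if "format" in spec and not re.match(spec["format"], value):
        return False
    if "range" in spec and not spec["range"][0] <= value <= spec["range"][1]:
        return False
    return True
```

Because every producer validates against the same published dictionary, disagreements about formats surface at entry time rather than during analysis.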

6 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility All required elements are reported.

7 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility Everyone who handles the data shares an understanding of the data and their definitions.

8 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility The data provide the right information to answer the questions that are asked.

9 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility Quality data are accessible to users at the correct time in order to provide information for decision-making.

10 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility Quality data are secured to protect privacy and to prevent tampering.

11 Dimensions Accuracy Completeness Consistency Utility/Validity
Timeliness Security Accessibility Data quality results from data use. Data must be available to authorized staff to improve decision making.
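Several of the NCES dimensions above can be checked mechanically. This is a rough sketch covering completeness, accuracy, and timeliness for a single record; the field names and thresholds are invented for illustration.

```python
from datetime import date, timedelta

REQUIRED = {"name", "enrollment_date", "grade_level"}  # completeness rule

def quality_report(record, today=date(2024, 1, 15)):
    """Return a list of dimension violations for one record."""
    issues = []
    # Completeness: all required elements are reported.
    present = {k for k, v in record.items() if v is not None}
    missing = REQUIRED - present
    if missing:
        issues.append(f"incomplete: missing {sorted(missing)}")
    # Accuracy: values fall within the ranges the data dictionary allows.
    grade = record.get("grade_level")
    if grade is not None and not 1 <= grade <= 12:
        issues.append("inaccurate: grade_level out of range")
    # Timeliness: the record is recent enough to support decision-making.
    updated = record.get("last_updated")
    if updated is not None and today - updated > timedelta(days=365):
        issues.append("stale: not updated within a year")
    return issues
```

Consistency and the remaining dimensions need cross-record or organizational checks, which is why they are harder to automate than the per-record rules shown here.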

12 Understanding of data handling

13 Understanding of data handling
Read this passage. How many processes have you noticed? What are the processes involved? How are data handled in each process?

14 The first stage in data analysis is the preparation of an appropriate form in which the relevant data can be collected and coded in a format suitable for entry into a computer; this stage is referred to as data processing. The second stage is to review the recorded data, checking for accuracy, consistency and completeness; this process is often referred to as data editing. Next, the investigator summarizes the data in a concise form to allow subsequent analysis—this is generally done by presenting the distribution of the observations according to key characteristics in tables, graphs and summary measures. This stage is known as data reduction. Only after data processing, editing and reduction should more elaborate statistical manipulation of the data be pursued.
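The three stages in the passage can be made concrete with a toy example. The coded survey records below are invented; "processing" here is simply parsing coded input into structured records.

```python
from statistics import mean

raw = ["34,F", "41,M", ",M", "29,F"]  # coded responses: age,sex

# 1. Data processing: parse coded input into a format suitable for analysis.
records = []
for line in raw:
    age, sex = line.split(",")
    records.append({"age": int(age) if age else None, "sex": sex})

# 2. Data editing: review for accuracy, consistency and completeness.
clean = [r for r in records if r["age"] is not None and 0 < r["age"] < 120]

# 3. Data reduction: summarise the distribution with tables and summary measures.
summary = {
    "n": len(clean),
    "mean_age": mean(r["age"] for r in clean),
    "by_sex": {s: sum(1 for r in clean if r["sex"] == s) for s in {"F", "M"}},
}
```

Only after these three stages would more elaborate statistical manipulation of `clean` be pursued, as the passage recommends.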

15 Data handling is the process of ensuring that data is stored, archived or disposed of in a safe and secure manner during and after completion of any program/project. This includes the development of policies and procedures to manage data handled electronically as well as through non-electronic means.

16 Proper planning for data handling can result in
efficient and economical storage, retrieval, and disposal of data.

17 In the case of data handled electronically, data integrity is a primary concern to ensure that recorded data is not altered, erased, lost or accessed by unauthorized users.

18 Issues that should be considered in ensuring integrity of data handled include the following:
Type of data handled and its impact. Type of media containing data and its storage capacity, handling and storage requirements, reliability, longevity, retrieval effectiveness, and ease of upgrade to newer media. Data handling responsibilities/privileges, that is, who can handle which portion of data, at what point during the program/project, for what purpose, etc. Data handling procedures that describe how long the data should be kept, and when, how, and who should handle data for storage, sharing, archival, retrieval and disposal purposes.
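The responsibilities and retention procedures in the list above can be expressed as explicit policy rules. This is a hedged sketch; the categories, roles, and retention periods are invented examples.

```python
from datetime import date

POLICY = {
    # category: (retention_years, roles allowed to handle this data)
    "student_records": (7, {"registrar", "admin"}),
    "survey_responses": (2, {"research", "admin"}),
}

def may_handle(role, category):
    """Handling responsibilities: who can handle which portion of data."""
    return role in POLICY[category][1]

def disposal_due(category, created, today):
    """Handling procedures: how long the data should be kept before disposal."""
    years = POLICY[category][0]
    return today >= created.replace(year=created.year + years)
```

Writing the policy as data rather than prose makes it auditable: the same table answers both "who may touch this?" and "when must it be disposed of?".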

19 Data quality dimensions in the literature
include dimensions such as accuracy, reliability, importance, consistency, precision, timeliness, understandability, conciseness and usefulness (Wand and Wang, 1996, p. 92).

20 Kahn et al. (1997) developed a data quality framework based on product and service quality theory, in the context of delivering quality information to information consumers.

21 Four levels of information quality were defined:
sound information, useful information, usable information, and effective information. The framework was used to define a process model to help organisations plan to improve data quality.

22 A more formal approach to data quality is provided in the framework of Wand and Wang (1996) who use Bunge’s ontology to define data quality dimensions. They formally define five intrinsic data quality problems: incomplete, meaningless, ambiguous, redundant, incorrect.

23 Semiotic Theory Semiotic theory concerns the use of symbols to convey knowledge. Stamper (1992) defines six levels for analysing symbols. These are the physical, empirical, syntactic, semantic, pragmatic and social levels.

24 Data quality can be examined at each of these levels:
Physical / Empirical - concerned with the physical media for the communication of data
Syntactic - concerned with the structure of data
Semantic - concerned with the meaning of data
Pragmatic - concerned with the usage of data (usability and usefulness)
Social - concerned with the shared understanding of the meaning of the data/information generated from the data

25 Data Quality: How good is your data?
This is an example of data quality as perceived by a company that produces GPS devices.

26 Precision or Resolution
Scale - the ratio of distance on a map to the equivalent distance on the earth's surface; primarily an output issue (at what scale do I wish to display?)
Precision or Resolution - the exactness of measurement or description; determined by input; can output at lower (but not higher) resolution
Accuracy - the degree of correspondence between data and the real world; fundamentally controlled by the quality of the input
Lineage - the original sources for the data and the processing steps it has undergone
Currency - the degree to which data represent the world at the present moment in time
Documentation or Metadata - data about data: recording all of the above
Standards - common or “agreed-to” ways of doing things; data built to standards are more valuable since they are more easily shareable
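The precision/accuracy distinction is easy to show numerically: a coordinate can be stored at high precision yet still be inaccurate, while rounding trades resolution for nothing in accuracy. The coordinate values below are made up for illustration.

```python
true_longitude = -122.41942   # "the real world" value, 5 decimal places

measured = -122.42871         # high precision, but inaccurate (off by ~0.009)
rounded = round(true_longitude, 2)  # accurate, but at lower resolution

precision_error = abs(true_longitude - measured)
resolution_loss = abs(true_longitude - rounded)

# Once rounded, the lost digits cannot be recovered: output can be at a
# lower resolution than the input, but never a higher one.
```

Here `precision_error` is much larger than `resolution_loss`: the precisely-recorded measurement is the worse depiction of the real world.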

27 DISCUSSIONS Discuss the strategies for ensuring quality data in all the categories listed in the table according to levels given in the context of educational settings or institutions.

28 Semiotic Level | Goal | Dimension | Improvement Strategy
Syntactic | Consistent | Well-defined (perhaps formal) syntax |
Semantic | Complete and Accurate | Comprehensive, Unambiguous, Meaningful, Correct |
Pragmatic | Usable and Useful | Timely, Concise, Easily Accessed, Reputable |
Social | Shared understanding of meaning | Understood, Awareness of Bias |

29 Semiotic Level | Goal | Dimension | Improvement Strategy
Syntactic | Consistent | Well-defined (perhaps formal) syntax | Corporate data model; Syntax checking; Training for data producers
Semantic | Complete and Accurate | Comprehensive, Unambiguous, Meaningful, Correct | Training for data producers; Minimise data transformations and transcriptions
Pragmatic | Usable and Useful | Timely, Concise, Easily Accessed, Reputable | Monitoring data consumers; Explanation and visualisation; High-quality data delivery systems; Data tagging
Social | Shared understanding of meaning | Understood, Awareness of Bias | Viewpoint analysis; Conflict resolution; Cultural immersion
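At the syntactic level, "syntax checking" simply means validating field formats before data is accepted. A minimal sketch, assuming two hypothetical fields and their formats:

```python
import re

# Hypothetical field formats; a corporate data model would define these centrally.
SYNTAX = {
    "school_year": re.compile(r"^\d{4}-\d{4}$"),  # e.g. 2023-2024
    "postcode": re.compile(r"^\d{5}$"),
}

def syntactically_valid(field, value):
    """Reject values whose structure does not match the defined syntax."""
    return bool(SYNTAX[field].fullmatch(value))
```

Checks like this enforce the "Consistent" goal in the table: semantic and pragmatic quality still require the training and monitoring strategies listed alongside it.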

30 4 Common Data Challenges Faced During Modernization:
Data is fragmented across multiple source systems - Each system holds its own notion of the policyholder, which makes developing a unified, user-centric view extremely difficult. The situation is further complicated because the level and amount of detail captured in each system are incongruent.

31 4 Common Data Challenges Faced During Modernization:
Data formats across systems are inconsistent - This arises when an organization operates systems from multiple vendors and each vendor has chosen to implement a custom data representation. Responding to evolving business needs has also diluted the meaning and usage of data fields: the same field represents different data depending on the context.
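A common remedy for vendor-specific representations is to normalise every value to one canonical form at the integration boundary. The sketch below assumes three hypothetical vendor date formats.

```python
from datetime import datetime

# Hypothetical formats used by different vendor systems.
VENDOR_FORMATS = ["%m/%d/%Y", "%Y-%m-%d", "%d.%m.%Y"]

def normalise_date(value):
    """Map any known vendor representation to canonical ISO-8601 form."""
    for fmt in VENDOR_FORMATS:
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {value!r}")
```

Normalising at the boundary means downstream systems see a single representation, even though each source system keeps its own.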

32 4 Common Data Challenges Faced During Modernization: (Cont.)
Data is lacking in quality - This occurs when an organization has units organized by line of function. Each unit holds expertise in a specific field and operates fairly autonomously, which has resulted in different data-entry practices. Moreover, the data models in decades-old systems weren't designed to handle today's business needs.

33 4 Common Data Challenges Faced During Modernization: (Cont.)
Systems are only available in defined windows during the day, not 24/7 - This happens when the organization's core systems are batch oriented: updates are not available in the system until batch processing has completed. Furthermore, while batch processing is taking place, the systems are unavailable for querying and for accepting data. Another aspect affecting availability is the closed nature of the systems: they do not expose functionality for reuse by other systems.

34 Lack of Centralized Approach Hurting Data Quality
“Data quality is the foundation for any data-driven effort, but the quality of information globally is poor. Organizations need to centralize their approach to data management to ensure information can be accurately collected and effectively utilized in today’s cross-channel environment.” Thomas Schutz, senior vice president, general manager of Experian Data Quality

