MSA Example: Attribute or Categorical Data


MSA for Attribute or Categorical Data

MSA Operational Definitions

Accuracy: Overall agreement of the measured value with the true value (which may be an "expert" value). Bias plus precision.
Attribute Data: Discrete qualitative data.
Attribute Measurement System: Compares parts to a specific set of criteria and accepts the item if the criteria are satisfied.
Bias: A systematic difference from the true value, revealed in the differences between the averages and the true value.
Precision: Variation in the measurement process.
R&R: Repeatability and Reproducibility, the two elements of precision.
Repeatability: The variation observed when the same operator measures the same item repeatedly with the same device.
Reproducibility: The variation observed when different operators measure the same parts using the same device (sometimes, the same operator using different devices).

This list provides a quick reference for the key terms used in Measurement System Analysis.

MSA for Continuous Processes 2 .PPT

The Fundamental MSA Question

"Is the variation (spread) of the measurement system too large to study the current level of process variation?"

Total (Observed) Variability = Product/Process Variability + Variation in the Measurement Process

Like all processes, the measurement process has CTQs. MSA quantifies the amount of variation for: Accuracy, Repeatability, Reproducibility, Stability (typically covered in the Black Belt workshop), and Linearity (also covered in the Black Belt workshop).
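
Because product variability and measurement variability are independent, their variances add. As a minimal sketch of this decomposition (all figures invented for illustration), simulate true part values, add measurement noise, and check that the variances roughly sum:

```python
import random
import statistics

# Simulate product variation (the true part values) and independent
# measurement noise; the observed variance is approximately their sum.
random.seed(1)
true_values = [random.gauss(100.0, 2.0) for _ in range(20_000)]  # product variation
measured = [v + random.gauss(0.0, 0.5) for v in true_values]     # plus measurement noise

var_product = statistics.variance(true_values)        # close to 4.0
var_total = statistics.variance(measured)             # close to 4.25
var_measurement = var_total - var_product             # the gage's share, ~0.25
print(f"product={var_product:.2f} measurement={var_measurement:.2f} total={var_total:.2f}")
```

If the measurement share dominates the total, the gage cannot resolve the process variation it is supposed to study.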

Possible Causes of Bias

(Figure: bias is the gap between the True Value or Standard and the Observed Average.)

Possible causes of bias:
- Sensor not properly calibrated
- Improper use of the sensor
- Unclear procedures
- Human limitations

Bias is the difference between the observed average of measurements and the true average. Validating accuracy is the process of quantifying the amount of bias in the measurement process. Experience has shown that bias and linearity are typically not major sources of measurement error for continuous data, but they can be. In service and transaction applications, evaluating bias most often involves testing the judgment of the people carrying out the measurements.

Example: A team wants to establish the accuracy of its process for measuring defects in invoices. First, they gather a "standard" group of invoices and have an "expert" panel establish the type and number of defects in the group. Next, they have the standard group of invoices measured by the "normal" measurement process. The difference between the averages the measurement process came up with and the known defect level from the expert panel represents the bias of the measurement process.
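
A quick numeric sketch of the invoice example above (all counts invented): the expert panel establishes the true defect count, the normal process measures the same batch several times, and bias is the gap between the observed average and the expert value.

```python
# Hypothetical figures: the expert panel found 12 defects in the standard
# group of invoices; the normal measurement process counted defects in six
# independent passes over the same group.
expert_count = 12
measured_counts = [10, 11, 10, 12, 9, 11]

observed_average = sum(measured_counts) / len(measured_counts)  # 10.5
bias = observed_average - expert_count                          # -1.5
print(f"bias = {bias:+.2f}")  # negative bias: the process under-counts defects
```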

Possible Causes of Poor Repeatability

Equipment:
- Gage/instrument needs maintenance
- The gage needs to be more rigid
People:
- Environmental conditions (lighting, noise)
- Physical conditions (eyesight)

Repeatability is the variation in measurements obtained when one operator uses the same measurement process to measure the identical characteristics of the same parts or items. It is determined by taking one person, or one measurement device, and measuring the same units or items repeatedly. Differences between the repeated measurements represent the ability of the person or device to be consistent. Possible causes of a lack of repeatability are listed above.

Possible Causes of Poor Reproducibility

(Figure: reproducibility is the difference between the mean of Operator A's measurements and the mean of Operator B's measurements.)

- Measurement procedure is not clear
- Operator is not properly trained in using and reading the gage
- Operational definitions not established

Reproducibility is very similar to repeatability. The only difference is that instead of looking at the consistency of one person, you are looking at the consistency between people. Reproducibility is the variation in the averages of measurements made by different operators using the same measurement process when measuring identical characteristics of the same parts or items. Possible causes of poor reproducibility include a measurement procedure that is not clear, operators not properly trained in using the measurement system, and operational definitions that are unclear or not established.

Attribute Measurement Systems Study

- Discrete qualitative data
- Go/no-go basis, or a limited number of data categories
- Compares parts to specific criteria to accept/reject them or place them in a category
- Must be screened for its effectiveness at discerning good parts from bad
- At least two appraisers, with two trials each
- If available, have a Quality Master rate the parts first

Listed here are the key highlights of conducting an MSA for attribute or categorical data. The "parts" can be, for example, invoices, physical parts, or reason codes for customer returns.

Attribute MSA Example

        Appraiser A        Appraiser B
Part   Trial 1  Trial 2   Trial 1  Trial 2   Master
  1      G        G         G        G         G
  2      G        G         G        G         G
  3      G        NG        G        G         G
  4      NG       NG        NG       NG        NG
  5      G        G         G        G         G
  6      G        G         G        NG        G
  7      NG       NG        NG       G         NG
  8      NG       NG        NG       G         G
  9      G        G         G        G         G
 10      G        G         G        G         G
 11      G        G         G        G         G
 12      G        G         G        G         G
 13      G        G         NG       G         G
 14      G        G         G        G         G
 15      NG       G         G        G         G
 16      G        G         G        G         G
 17      G        G         G        G         G
 18      G        G         G        G         G
 19      G        G         G        G         G
 20      NG       G         G        G         G

G = Good, NG = Not Good

This shows the results of two rounds with two appraisers assessing the same 20 items.
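
The agreement figures this kind of study reports can be computed directly from the table above. A sketch in Python (Minitab's attribute agreement analysis does the same counting, plus confidence intervals):

```python
# Columns per row: Appraiser A trials 1 & 2, Appraiser B trials 1 & 2, Master.
rows = [
    ("G","G","G","G","G"), ("G","G","G","G","G"), ("G","NG","G","G","G"),
    ("NG","NG","NG","NG","NG"), ("G","G","G","G","G"), ("G","G","G","NG","G"),
    ("NG","NG","NG","G","NG"), ("NG","NG","NG","G","G"), ("G","G","G","G","G"),
    ("G","G","G","G","G"), ("G","G","G","G","G"), ("G","G","G","G","G"),
    ("G","G","NG","G","G"), ("G","G","G","G","G"), ("NG","G","G","G","G"),
    ("G","G","G","G","G"), ("G","G","G","G","G"), ("G","G","G","G","G"),
    ("G","G","G","G","G"), ("NG","G","G","G","G"),
]

n = len(rows)
within_a = sum(a1 == a2 for a1, a2, b1, b2, m in rows) / n        # A consistent with himself
within_b = sum(b1 == b2 for a1, a2, b1, b2, m in rows) / n        # B consistent with himself
a_vs_std = sum(a1 == a2 == m for a1, a2, b1, b2, m in rows) / n   # A right on both trials
b_vs_std = sum(b1 == b2 == m for a1, a2, b1, b2, m in rows) / n   # B right on both trials
all_agree = sum(a1 == a2 == b1 == b2 for a1, a2, b1, b2, m in rows) / n

print(f"Within A {within_a:.0%}, Within B {within_b:.0%}")
print(f"A vs Standard {a_vs_std:.0%}, B vs Standard {b_vs_std:.0%}")
print(f"All appraisers agree {all_agree:.0%}")
```

Here Appraiser A agrees with himself on 85% of parts and Appraiser B on 80%, but all four ratings coincide on only 65% of parts: overall agreement always drops as more ratings must line up.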

Challenges of Continuous Process MSA

An MSA study is an experiment. It:
- Requires two or more trials in order to calculate Repeatability
- Needs a way to present the inspection units to the appraiser multiple times
- Is not possible within the continuous process itself

When conducting an MSA for a continuously running process, parts must therefore be taken off-line to conduct the study.

Case Example: Visual Inspection of Glass

(Figure: layout of the glass line, showing the catwalk, glass inspector, cutter, and packers.)

The example given here is the visual inspection of glass.

Case Example: Challenges to Overcome

- Bias to the standard could be evaluated on-line.
- Repeatability and Reproducibility (R&R) could not be evaluated on-line.
- A method had to be devised to allow the inspectors to view the same pieces of glass repeatedly.
- The solution was an off-line conveyor that simulated the on-line condition as closely as possible.

Case Example: Attribute MSA Method Employed

- 20 pieces of glass from the process, including both good and bad samples, were selected.
- A team of people well versed in the quality standard classified each piece of glass as either "pass" or "fail."
- All regular inspectors independently evaluated each piece twice (in random order).
- The inspectors used a log sheet to record the data.
- Minitab® was used to analyze the data.

There were two outcomes in this inspection (measurement) process: pass or fail. Twenty pieces, a team of inspectors, and two rounds (trials) were used in the MSA.
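
The presentation plan described in these bullets can be sketched as a small script (inspector names and crew size are illustrative, not taken from the study):

```python
import random

# Each inspector rates every one of the 20 pieces twice, and the order of
# presentation is re-randomized for each round so repeats are not obvious.
random.seed(7)
pieces = list(range(1, 21))
inspectors = ["Inspector_1", "Inspector_2", "Inspector_3"]  # illustrative crew

plan = []
for inspector in inspectors:
    for trial in (1, 2):
        order = pieces[:]
        random.shuffle(order)  # hide the fact that pieces repeat
        plan.extend((inspector, trial, piece) for piece in order)

print(len(plan))  # 3 inspectors x 2 trials x 20 pieces = 120 ratings
```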

Case Example: Attribute MSA Study Data

(Excerpt of the full data set for the 20 inspectors.)

This slide shows the data in MINITAB®. The "Standard" column documents the correct, or expert, answer for each piece of glass.

Case Example: Attribute MSA Study Results

The graph on the left shows the agreement (repeatability) of each appraiser between Trial 1 and Trial 2. The graph on the right shows the agreement of each appraiser with the Standard. On both graphs, the blue dots show the percent agreement and the red lines are the 95% confidence intervals.
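
The interval lines on such plots come from treating each appraiser's agreement count as a binomial proportion. A sketch using the Wilson score interval (an approximation; Minitab reports exact intervals, so its numbers will differ slightly):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Approximate 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# E.g. an appraiser whose two trials agreed on 17 of 20 pieces:
lo, hi = wilson_ci(17, 20)
print(f"85% agreement, 95% CI about ({lo:.1%}, {hi:.1%})")
```

With only 20 pieces the interval is wide, which is why attribute studies need a healthy number of parts before small differences between appraisers mean anything.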

Case Example: Attribute MSA Study Results (continued)

This slide shows the Within Appraiser agreement. For example, Larry scored 100%: his Trial 1 and Trial 2 ratings are in full agreement. On the other hand, Allen scored only 50% agreement between his Trial 1 and Trial 2 measurements. In other words, he disagrees with himself half the time!

Case Example: Attribute MSA Study Results (continued)

This slide shows the agreement of each appraiser (across both trials) with the Standard. For example, Larry has 89% agreement with the Standard, but Allen has only 39%.

Case Example: Attribute MSA Study Results (continued)

This slide shows the level of agreement across all appraisers. In this case, only 5.56% agreement!
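
An overall figure this low is not surprising with a large crew. As a rough, invented illustration: even if each of the 20 inspectors independently matched the standard on about 87% of ratings, the chance that every one of them is right on a given piece is only a few percent:

```python
# Invented per-inspector rate, not a figure from the study.
p_individual = 0.87
n_inspectors = 20
p_all_match = p_individual ** n_inspectors  # all 20 right on one piece
print(f"{p_all_match:.2%}")  # about 6%, the same order as the observed 5.56%
```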

Case Example: Attribute MSA Study Conclusions

- What could have caused the poor agreement?
- What was done to improve consistency?

Given the results of the MSA study, what could have caused the poor agreement, and what should be done to improve the measurement system? The measurement system must be improved and tested again (with another MSA study) to reach at least 90% agreement before the data can be used for baselining process performance or further analysis.