Presentation transcript:

Slide 1: RSS Destructive Testing MSA. ISO/TS 16949:2009(E) and AIAG MSA 4th edn. (2010). Martin Gibson CStat, CSci, MSc, MBB, AQUIST Consulting. © M G Gibson 2010.

Slide 2: Making sense of MSA
 Do you know how accurate and precise your measurement and test equipment are?
 Do you suspect that good work is sometimes condemned as bad simply because of uncertainty in the measurement system? Is bad work ever released as good?
 Do you know the cost of non-capable measurement systems?
 Do you realise how important it is to understand measurement system uncertainty?
 Does your auditor share your understanding of measurement systems?
 What can you do about it?

Slide 3: ISO/TS 16949:2009(E) Measurement System Analysis
 Statistical studies shall be conducted to analyse the variation present in the results of each type of measuring and test equipment system.
 ... applies to measurement systems referenced in the control plan.
 ... analytical methods and acceptance criteria used shall conform to those in customer reference manuals on MSA.
 Other analytical methods and acceptance criteria may be used if approved by the customer.
Questions:
 What is the operational definition of "statistical studies"?
 Do organisations, auditors and quality managers understand statistical studies?
 Why do auditors ask, "Can you show me GR&R studies for each type of measuring and test equipment system referenced in the control plan?"

Slide 4: ISO/TS Scheme Update, IF SMMT Webinar, 5 Nov
Common problems found in ISO/TS 16949 audits:
 Calibration and MSA (7.6 and 7.6.1)
 Definition of laboratory scope
 Control of external laboratories
 Traceability to national or international standards
 MSA not done for all types of measuring systems
 MSA only considering gauge R and R
Questions:
1. Why is MSA regarded as GR&R?

Slide 5: Ford Motor Company MSA requirements (2009)
4.35 (ISO/TS cl ): All gauges used for checking Ford components/parts per the control plan shall have a gauge R&R performed in accordance with the appropriate methods described by the latest AIAG MSA to determine measurement capability.
 Variable gauge studies should utilize 10 parts, 3 operators and 3 trials
 Attribute gauge studies should utilize 50 parts, 3 operators and 3 trials
Questions:
1. Are some Customers leading the thinking?
2. Why just limited to products?
3. What are your Customer expectations?

Slide 6: Measurement System Variation (diagram)
The diagram breaks measurement system variation into accuracy (bias, stability, linearity; addressed by calibration) and precision (repeatability, reproducibility; addressed by gauge R&R).
Observed Variation = Process Variation + Measurement System Variation
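The decomposition above is additive in the variances rather than the standard deviations. A minimal Python simulation sketch (all values hypothetical) illustrates this:

  import numpy as np

  rng = np.random.default_rng(1)
  sigma_process, sigma_gauge = 1.0, 0.5                           # hypothetical standard deviations
  true_values = rng.normal(10.0, sigma_process, 100_000)          # part-to-part (process) variation
  observed = true_values + rng.normal(0.0, sigma_gauge, 100_000)  # plus measurement error

  print(observed.var())                      # approximately 1.25: variances add
  print(sigma_process**2 + sigma_gauge**2)   # 1.25 exactly; standard deviations do not add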

Slide 7: AIAG MSA 4th edn. (2010)
 Accuracy, bias, stability, linearity, precision, repeatability, reproducibility, GR&R
 Attribute, variable and non-replicable data considered
 Variables GR&R study: 10 parts, 3 operators, 3 measurements; parts chosen from 80% of tolerance; destructive testing requires 90 parts from a homogeneous batch
 Three analytical methods:
1. Range: basic analysis, no estimates of R&R
2. Average & Range: provides estimates of R&R (sketched below)
3. ANOVA: preferred; estimates parts, appraisers, the part*operator interaction, and replication error due to the gauge
Question:
1. Do organisations, auditors and quality managers understand MSA?
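As an illustration of method 2 above, the following Python sketch implements the AIAG Average & Range calculation for the standard layout of 10 parts, 3 operators and 3 trials. The function name and the data layout are assumptions, and the default constants K1, K2 and K3 are the tabulated AIAG values for that layout only:

  import numpy as np

  def average_and_range_grr(data, K1=0.5908, K2=0.5231, K3=0.3146):
      # data: array of shape (operators, parts, trials);
      # constants assume the 3-operator, 10-part, 3-trial layout
      n_ops, n_parts, n_trials = data.shape
      rbar = np.ptp(data, axis=2).mean()              # mean within-cell range
      ev = rbar * K1                                  # repeatability (equipment variation)
      xbar_diff = np.ptp(data.mean(axis=(1, 2)))      # range of the operator averages
      av_sq = (xbar_diff * K2) ** 2 - ev ** 2 / (n_parts * n_trials)
      av = np.sqrt(max(av_sq, 0.0))                   # reproducibility (appraiser variation)
      grr = np.hypot(ev, av)                          # combined gauge R&R
      pv = np.ptp(data.mean(axis=(0, 2))) * K3        # part variation from range of part means
      tv = np.hypot(grr, pv)                          # total variation
      return {"%GRR": 100 * grr / tv, "ndc": int(1.41 * pv / grr)}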

Slide 8: AIAG ANOVA Models
 Crossed vs. nested models (see the sketch below):
Crossed: Y_ijk = μ + Operator_i + Part_j + (Operator*Part)_ij + ε_k(ij)
Nested: Y_ijk = μ + Operator_i + Part_j(i) + ε_k(ij)
 Crossed vs. nested? See Barrentine; Moen, Nolan & Provost; Bower; Burdick; Skrivanek
 Fixed vs. mixed effects models?
 Software:
 Minitab V16+ includes fixed effects, mixed effects and enhanced models; the pooled standard deviation approach is not included
 SPC for Excel: fixed effects
 Other software packages?
Question:
 Do organisations, auditors and quality managers understand ANOVA?
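For the crossed, random-effects model above, a minimal Python sketch using statsmodels shows how the variance components are typically recovered from the ANOVA mean squares. The column names y, part and operator and the function name are placeholders, not anything prescribed by the AIAG manual:

  import statsmodels.api as sm
  from statsmodels.formula.api import ols

  def crossed_anova_grr(df, n_trials):
      # df: one row per measurement, with columns y, part, operator (hypothetical names)
      model = ols("y ~ C(part) * C(operator)", data=df).fit()
      aov = sm.stats.anova_lm(model, typ=2)
      ms = aov["sum_sq"] / aov["df"]                   # mean squares for each term
      n_parts, n_ops = df["part"].nunique(), df["operator"].nunique()
      var_repeat = ms["Residual"]                                      # repeatability (gauge)
      var_inter = max((ms["C(part):C(operator)"] - ms["Residual"]) / n_trials, 0.0)
      var_oper = max((ms["C(operator)"] - ms["C(part):C(operator)"]) / (n_parts * n_trials), 0.0)
      var_part = max((ms["C(part)"] - ms["C(part):C(operator)"]) / (n_ops * n_trials), 0.0)
      return {"GRR": var_repeat + var_inter + var_oper, "Part": var_part}

The nested model drops the interaction term: each part appears under only one operator, so the part effect is estimated within operators and cannot be separated from any operator*part interaction.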

Slide 9: GR&R Variables Data Acceptance Criteria
 % Contribution: measurement system variation as a percentage of total observed process variation, using variances (additive). < 1% good; 1-9% acceptable; > 9% unacceptable.
 % Study Variation: measurement system standard deviation as a percentage of total observed process standard deviation (not additive). < 10% good; 10-30% acceptable; > 30% unacceptable.
 % Tolerance: measurement error as a percentage of tolerance. < 10% good; 10-30% acceptable; > 30% unacceptable.
 Number of distinct categories (ndc): measures the resolution of the scale. > 10 good; 5-10 acceptable; < 5 unacceptable.
Question:
 Do organisations, auditors and quality managers understand the metrics? (A sketch computing these metrics follows below.)
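A minimal sketch of how the four metrics relate, assuming the GR&R and part variance components have already been estimated (the function name and inputs are hypothetical):

  import math

  def grr_metrics(var_grr, var_part, tol_width):
      var_total = var_grr + var_part
      sd_grr, sd_part, sd_total = map(math.sqrt, (var_grr, var_part, var_total))
      return {
          "%Contribution": 100 * var_grr / var_total,      # variances, additive
          "%StudyVar":     100 * sd_grr / sd_total,        # standard deviations, not additive
          "%Tolerance":    100 * 6 * sd_grr / tol_width,   # AIAG 4th edn. uses a 6-sigma spread
          "ndc":           int(1.41 * sd_part / sd_grr),   # truncated to an integer
      }

  # Example: grr_metrics(var_grr=0.04, var_part=0.96, tol_width=6.0)
  # gives %Contribution 4, %StudyVar 20, %Tolerance 20 and ndc 6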

Slide 10: Non-replicable GR&R case study (Anon, 2002)
 Ensure that all the conditions surrounding the measurement and testing environment are defined, standardized and controlled:
 appraisers should be similarly qualified and trained
 lighting should be adequate and consistently controlled
 work instructions should be detailed and operationally defined
 environmental conditions should be controlled to an adequate degree
 equipment should be properly maintained and calibrated, with failure modes understood, etc.

Slide 11: Non-replicable GR&R case study (Anon, 2002)
 If the overall process appears to be stable and capable, and all the surrounding prerequisites have been met, it may not make sense to spend the effort on a non-replicable study, since the overall capability already includes measurement error: if the total product variation and location are OK, the measurement system may be considered acceptable.
 Ironically, a high Cp/Cpk gives a poor ndc (a worked illustration follows below).
 AIAG FAQ response: "If your process is stable and capable, the spread of this acceptable process distribution includes your measurement error. There may be no need to study your measurement error from a purely 'acceptability' viewpoint."
Question:
1. Do organisations, auditors and quality managers understand this concept?
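To see why a capable process can coexist with a poor ndc, here is a small worked example in Python with purely hypothetical numbers: the more capable the process, the less part-to-part variation there is for the gauge to resolve.

  import math

  tol, sd_gauge, sd_part = 10.0, 0.5, 0.4   # hypothetical tolerance width and standard deviations
  sd_total = math.hypot(sd_part, sd_gauge)

  print(round(tol / (6 * sd_total), 2))     # Cp approx 2.6: the process looks very capable
  print(round(100 * 6 * sd_gauge / tol))    # %Tolerance = 30: the gauge is borderline against tolerance
  print(int(1.41 * sd_part / sd_gauge))     # ndc = 1: the gauge cannot distinguish the parts from each other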

Slide 12: Questions for making sense of MSA
 What is the operational definition of statistical studies?
 Do organisations, auditors and quality managers understand statistical studies?
 Why do auditors ask for GR&R studies?
 Why is MSA regarded as GR&R?
 Are (some) Customers leading the thinking?
 Why is MSA limited to products?
 Do you know your Customer expectations?
 Do organisations, auditors and quality managers understand MSA, ANOVA, crossed vs. nested designs, fixed vs. mixed models, the metrics, and why a high Cp/Cpk gives a low ndc?
 Is MSA seen just as a QMS requirement or as a true part of continuous improvement?

Slide 13: References
 AIAG, Measurement Systems Analysis, 4th edn. (2010)
 Anon., Non-replicable GR&R case study (circa 2002)
 Barrentine, Concepts for R&R Studies, 2nd edn., ASQ (2003)
 Bower, A Comment on MSA with Destructive Testing (2004); see also keithbower.com
 Gorman & Bower, "Measurement Systems Analysis and Destructive Testing", ASQ Six Sigma Forum Magazine, Vol. 1, No. 4 (August 2002)
 Burdick, Borror & Montgomery, "A review of methods for measurement systems capability analysis", Journal of Quality Technology, 35(4) (2003)
 Burdick, Borror & Montgomery, Design and Analysis of Gauge R&R Studies, SIAM/ASA (2005)
 Moen, Nolan & Provost, "Using a nested design for quantifying a destructive test", in Improving Quality Through Planned Experimentation, 1st edn., McGraw-Hill (1991)
 Skrivanek, How to conduct an MSA when the part is destroyed during measurement, moresteam.com/whitepapers/nested-gage-rr.pdf

Slide 14: Example: crossed vs. nested (5x2x2 for brevity)
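The slide's 5x2x2 worked example is not reproduced in this transcript. As an illustration only, using the same hypothetical column names as the earlier ANOVA sketch, the two designs differ in the model formula passed to statsmodels:

  # Crossed (non-destructive): every operator measures every part repeatedly,
  # so the operator*part interaction can be estimated.
  crossed_formula = "y ~ C(part) * C(operator)"

  # Nested (destructive): each piece is consumed by the test, so parts are nested
  # within operators and the interaction cannot be separated from repeatability.
  nested_formula = "y ~ C(operator) + C(operator):C(part)"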