Common Mistakes and Misconceptions During Data Collection and Analysis


1 Common Mistakes and Misconceptions During Data Collection and Analysis
Lean Six Sigma DMAIC process. Hans Vanhaute, 04/08/2014.

2 Goal of tonight’s presentation
Goals:
- Give you a few examples of common mistakes made during the "Measure" phase of DMAIC projects.
- Draw more widely applicable lessons and conclusions that may benefit you (so you don't make the same mistakes).
- Hopefully provide you with some interesting insights (and not put you to sleep).

3 DMAIC and "Projects"
A project is "a problem scheduled for a solution": management decides the problem is important enough to provide the resources needed to get it solved.

4 "DMAIC Projects"
A Six Sigma DMAIC project eliminates a chronic problem that causes customer dissatisfaction, defects, costs of poor quality, or other deficiencies in performance. Its phases: Define, Measure, Analyze, Improve, Control. It is very data-intensive.

5 The DMAIC steps: M – Measure
- Define a high-level process map.
- Define the measurement plan.
- Test the measurement system ("gauge study").
- Collect the data to objectively establish the current baseline.
Typical tools: capability analysis, Gage R&R studies.

6 Capability Analysis Conundrums
The process is a black box, Y = f(unknown Xs), so the initial analysis often runs into:
- Instability of the process over time
- Cpk values that inaccurately predict process performance
- Non-normal data

7 Capability Analysis Conundrums
Case 1: inherent non-normality of the process output. Some physical, chemical, or transactional processes produce outcomes that "lean" one way: time measurements, or values close to zero but always positive (surface roughness RMS…). Process experts, or careful analysis of the metric, should be able to help with understanding. Good news: capability analysis of non-normal data is possible (see the sketch below). Bad news: this situation doesn't happen very often.
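A minimal sketch of the non-normal approach, assuming SciPy is available and using fabricated lognormal data with hypothetical spec limits: instead of mean ± 3 sigma, fit a suitable distribution and use its 0.135th / 50th / 99.865th percentiles (the percentile method used by common stats packages).

```python
# Minimal sketch: capability analysis of non-normal (right-skewed) data.
# Instead of mu +/- 3*sigma, fit a distribution and use its 0.135th, 50th,
# and 99.865th percentiles in the Cpk-style ratio (percentile method).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.5, sigma=0.4, size=500)  # positive, skewed
usl, lsl = 6.0, 0.0  # hypothetical spec limits

# Naive normal-theory Cpk (misleading on skewed data)
mu, s = data.mean(), data.std(ddof=1)
cpk_normal = min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))

# Percentile-based equivalent using a fitted lognormal
shape, loc, scale = stats.lognorm.fit(data, floc=0)
p00135, p50, p99865 = stats.lognorm.ppf(
    [0.00135, 0.5, 0.99865], shape, loc=loc, scale=scale)
cpk_percentile = min((usl - p50) / (p99865 - p50),
                     (p50 - lsl) / (p50 - p00135))

print(f"normal-theory Cpk:    {cpk_normal:.2f}")
print(f"percentile-based Cpk: {cpk_percentile:.2f}")
```

On data like this, the normal-theory value understates capability on one side and overstates it on the other; the percentile version respects the skew.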

8 Capability Analysis Conundrums
Case 2: Problematic measurement systems (we’ll come back to that one when we discuss GR&R…)

9 Capability Analysis Conundrums
Case 3: failure to stratify the data. This is the big one! Stratification is the separation of data into categories: it means to "break up" the data to see what it tells you. Its most frequent use is in diagnosing a problem and identifying which categories contribute to the problem being solved.

10 Capability Analysis Conundrums
[Diagram: four parallel streams, each with its own capability (Stream 1: Cpk1, Stream 2: Cpk2, Stream 3: Cpk3, Stream 4: Cpk4), feeding one combined output. What single Cpk describes the whole?]

11 Capability Analysis Conundrums
What is a Cpk value supposed to tell us? The expected future performance of the process(es), assuming statistical stability over time. So should we report one Cpk value, or two?
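For reference, the standard Cpk computation (a minimal sketch; the data and spec limits below are hypothetical):

```python
# Standard Cpk definition: distance from the process mean to the nearer
# spec limit, in units of 3 standard deviations.
import numpy as np

def cpk(data, lsl, usl):
    mu = np.mean(data)
    s = np.std(data, ddof=1)
    return min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))

# Hypothetical example: well-centered process, sigma ~ 1, specs 4..16
sample = np.random.default_rng(7).normal(loc=10, scale=1.0, size=200)
print(f"Cpk = {cpk(sample, lsl=4, usl=16):.2f}")  # ~2.0 for this sample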

12 Capability Analysis Conundrums
Without stratification:
- Over-estimating the variation of the process (why? mixing streams with different means widens the apparent spread),
- Under-estimating process capability,
- Leading to all sorts of non-value-added activity for your organization.
With stratification:
- Recognize that two of the four streams are the main drivers of overall capability,
- Correctly estimate the two most important process capabilities,
- Point to the appropriate improvement activities.

13 Capability Analysis Conundrums
[Example: unstratified, actual data and prediction give Cpk = 0.67; stratified, the two dominant streams give Cpk = 1.15 and Cpk = 1.50, with actual data and prediction agreeing.]
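A minimal sketch of the pooled-versus-stratified comparison on fabricated two-stream data (spec limits hypothetical). Mixing streams with different means inflates the pooled sigma and drags the single Cpk down, even though each stream is capable:

```python
# Pooled vs. stratified capability on fabricated two-stream data.
import numpy as np
import pandas as pd

def cpk(x, lsl, usl):
    mu, s = x.mean(), x.std(ddof=1)
    return min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "stream": ["A"] * 300 + ["B"] * 300,
    "y": np.concatenate([rng.normal(9.0, 0.5, 300),    # stream A
                         rng.normal(11.0, 0.5, 300)]), # stream B
})
lsl, usl = 7.0, 13.0  # hypothetical spec limits

print(f"pooled Cpk: {cpk(df['y'], lsl, usl):.2f}")   # low: ~0.9
for name, grp in df.groupby("stream"):
    print(f"stream {name} Cpk: {cpk(grp['y'], lsl, usl):.2f}")  # ~1.3 each
```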

14 Capability Analysis Conundrums
Problematic measurement systems:
- 2a: Limiting factors on "how well" you can measure something.
- 2b: "I passed my GR&R but I'm still getting weird results."
- 2c: Time effects.

15 Case 2a: Limits to measurements
Game: identify the dataset with the highest resolution. Resolution: (a) the process or capability of making distinguishable the individual parts of an object, closely adjacent optical images, or sources of light; (b) a measure of the sharpness of an image or of the fineness with which a device can produce or record such an image.

16 Case 2a: Limits to measurements
Which dataset has the highest resolution? Measurement resolution: (a) the process or capability of making distinguishable the individual parts of a dataset or closely adjacent data points; (b) a measure of the sharpness of a set of data or of the fineness with which a measurement device can produce or record such a dataset.

17 Case 2a: Limits to measurements
Limiting factors on "how well" you can measure something. [Four datasets recorded at progressively coarser resolution: (1) e.g. 9.9397, 9.0401; (2) e.g. 9.9, 10.5, 11.4, 9.0; (3) e.g. 9.6, 10.2, 11.4, 9.0; (4) e.g. 10, 11, 9.]

18 Case 2a: Limits to measurements
[The same four datasets shown graphically, resolution decreasing from 1 to 4.]

19 Case 2a: Limits to measurements
Sample standard deviation of the four datasets:
- Dataset 1: S = 1.000
- Dataset 2: S about 0.5% over
- Dataset 3: S about 4% over
- Dataset 4: S about 15% over
The coarser the resolution, the more the estimated standard deviation is inflated.
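A minimal simulation of that inflation (fabricated data, so the percentages will differ from the author's datasets). Rounding to a step h adds roughly h^2/12 of variance, which is Sheppard's correction:

```python
# How rounding (coarser resolution) inflates the estimated std deviation.
import numpy as np

rng = np.random.default_rng(42)
true = rng.normal(loc=10.0, scale=1.0, size=100_000)

for decimals in (4, 1, 0):
    s = np.round(true, decimals).std(ddof=1)
    print(f"{decimals} decimals: S = {s:.4f} "
          f"({100 * (s - 1.0):+.1f}% vs true sigma = 1)")
```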

20 Case 2a: Limits to measurements
Limiting factors on "how well" you can measure something: why does this happen?
- "Always done it that way, never given it any thought."
- Focus on "meeting specs," not on controlling the process.
- "Always" round to x decimal places; nobody said how many decimals were needed.
- The old "1 in 10" rule of thumb seems to make sense: the measurement resolution should be no coarser than 1/10th of the data range, or no coarser than 1/10th of the spec range (see the helper below).
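A small helper that applies the rule of thumb; the function name and return format are my own, for illustration only:

```python
# Rule-of-thumb check: the measurement increment should be no coarser than
# 1/10th of the data range (or of the tolerance). Hypothetical helper.
import numpy as np

def resolution_ok(data, increment, lsl=None, usl=None):
    spread = np.max(data) - np.min(data)
    checks = {"vs data range": increment <= spread / 10}
    if lsl is not None and usl is not None:
        checks["vs spec range"] = increment <= (usl - lsl) / 10
    return checks

data = [9.9, 10.5, 11.4, 9.0]  # range = 2.4
print(resolution_ok(data, increment=0.1, lsl=7, usl=13))
# {'vs data range': True, 'vs spec range': True}
print(resolution_ok(data, increment=1.0, lsl=7, usl=13))
# {'vs data range': False, 'vs spec range': False}
```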

21 Case 2a: Limits to measurements
Limiting factors on "how well" you can measure something. [Graphic-only slide.]

22 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. GR&R 101, the metrics: the P/TV ratio expresses the total measurement variability as a percentage of the total historical process variation. [Figure: distribution of measurements overlaid with the distribution of measurement variability; here P/TV ≈ 14%.]

23 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. GR&R 101, the metrics: P/T expresses the total measurement variability as a percentage of the tolerance width of the process. [Figure: distribution of measurement error against the spec limits; here P/T ≈ 12.5%.]

24 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. GR&R 101, the metrics, with the usual acceptance scale for both P/TV and P/T:
- Very good: < 10%
- Marginal: 10–30%
- Needs improvement: > 30%
Simple, right? Not so fast…
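A minimal sketch of both metrics (conventions vary: 6 sigma is used for the precision spread here, though some references use 5.15 sigma; all sigma and tolerance values are hypothetical, chosen to reproduce the ~12.5% and ~14% figures from the earlier slides):

```python
# P/T and P/TV from gauge-study variance components (hypothetical values).
# P/T = precision / tolerance, P/TV = precision / total variation.
def gauge_metrics(sigma_gauge, sigma_total, lsl, usl, k=6.0):
    p_t = k * sigma_gauge / (usl - lsl)
    p_tv = sigma_gauge / sigma_total
    return p_t, p_tv

def rating(x):
    if x < 0.10:
        return "very good"
    return "marginal" if x <= 0.30 else "needs improvement"

p_t, p_tv = gauge_metrics(sigma_gauge=0.125, sigma_total=0.9, lsl=7, usl=13)
print(f"P/T  = {p_t:.1%} ({rating(p_t)})")   # 12.5% (marginal)
print(f"P/TV = {p_tv:.1%} ({rating(p_tv)})")  # 13.9% (marginal)
```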

25 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. On the R chart by operator:
- Points inside the control limits indicate that the operator is consistent between repeat measurements made on the same sample (good).
- Points outside the control limits indicate that the operator is not consistent between repeat measurements made on the same sample (bad).
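A minimal sketch of those control limits for fabricated repeat measurements; D4 = 3.267 is the standard range-chart constant for subgroups of size 2, and there is no lower control limit in that case:

```python
# Range chart for a gauge study: each subgroup is one operator's repeat
# measurements on one part. UCL_R = D4 * Rbar; for n = 2, D4 = 3.267.
import numpy as np

repeats = np.array([  # fabricated: 5 parts x 2 repeat measurements
    [10.01, 10.03],
    [9.52, 9.50],
    [10.88, 10.91],
    [9.97, 10.02],
    [10.45, 10.44],
])
ranges = repeats.max(axis=1) - repeats.min(axis=1)
rbar = ranges.mean()
ucl = 3.267 * rbar  # D4 for subgroup size 2

for i, r in enumerate(ranges, start=1):
    flag = "out of control" if r > ucl else "ok"
    print(f"part {i}: R = {r:.3f} ({flag})")
print(f"Rbar = {rbar:.3f}, UCL = {ucl:.3f}")
```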

26 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. [Example: P/T = 22%, P/TV = 16%.]

27 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. [Example: P/T = 70%, P/TV = 40% versus P/T = 10%, P/TV = 6%.]

28 Case 2b: "Weird" Stuff
I passed my GR&R but I'm still getting "weird" results. So… what caused this? [Diagram of the measurement setup: camera, lens, ring light, pin tip position.]

29 Case 2c: Time Effects
The speed of information is finite. Information can come from different distances.

30 Case 2c: Time Effects
The speed of information is finite. Information can come from different distances:
- Moon: 1.2 light-seconds away
- Sun: 8 light-minutes away
- Mars: 12.5 light-minutes away
- Pluto: 5.5 light-hours away
- Proxima Centauri: 4.2 light-years away

31 Case 2c: Time Effects
The speed of information is finite, and information can come from different distances. Just because you "observe" (measure, see) several events "at the same time" doesn't mean they all occurred at the same time.

32 Case 2c: Time Effects
[Example: the same events arranged by order of observation versus by order of occurrence.]

33 Case 2c: Time Effects
What can you do?
- Collect the data as close as possible to the origin of the event you are observing.
- Ensure "traceability" of the events you are observing.
- "De-convolve" the data (in mathematics, deconvolution is an algorithm-based process used to reverse the effects of convolution on recorded data); a sketch of the traceability idea follows.
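A minimal sketch of the traceability idea (all names, timestamps, and latencies are hypothetical): if each observation carries its known measurement latency, the order of occurrence can be recovered from the order of observation.

```python
# Recovering order of occurrence from order of observation (hypothetical
# events: each record carries when it was observed and the known delay
# between the event happening and it being measured).
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    observed_at: float  # seconds on the observer's clock
    latency: float      # known measurement/transport delay, seconds

    @property
    def occurred_at(self) -> float:
        return self.observed_at - self.latency

events = [
    Event("A", observed_at=100.0, latency=0.5),
    Event("B", observed_at=100.2, latency=5.0),  # slow data path
    Event("C", observed_at=100.4, latency=0.1),
]

print("by observation:", [e.name for e in sorted(events, key=lambda e: e.observed_at)])
print("by occurrence: ", [e.name for e in sorted(events, key=lambda e: e.occurred_at)])
# by observation: ['A', 'B', 'C']
# by occurrence:  ['B', 'A', 'C']
```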

34 So… What did we learn?
Blind reliance on some index value (Cpk, Cp, P/T, P/TV, …) to tell you what is going on might get you in trouble. Always:
- Make sure you understand how the index is calculated.
- Use the approach fully, not halfway.
- Verify that all assumptions were met.
Data stratification opportunities abound; identify them early on in your project. A few simple rules of thumb will quickly tell you whether you have a chance of having a good measurement system.

35 So… What did we learn?
Further analysis of the Gage R&R data can give you great insights into, and improvement opportunities for, your measurement process. Information travels at a finite speed; being aware of this and planning for it during your Measure phase will help keep you on the right track.

36 Parting Thoughts
My organization doesn't use Six Sigma; do these insights benefit me as well?

37 Thank You Questions?

