
1 Process Monitoring with Supervised Learning and Artificial Contrasts
Wookyeon Hwang, Univ. of South Carolina
George Runger, School of Computing, Informatics, and Decision Systems Engineering, Arizona State University (runger@asu.edu)
Eugene Tuv, Intel
QPRC, June 2009

2 Statistical Process Control / Anomaly Detection
Objective is to detect change in a system
– Transportation, environmental, security, health, processes, etc.
The modern approach leverages massive data
– Continuous, categorical, missing values, outliers, nonlinear relationships
Goal is a widely applicable, flexible method
– Normal conditions and fault types unknown
Capture relationships between multiple variables
– Learn patterns, exploit patterns
– Traditional Hotelling's T² captures structure, provides a control region (boundary), and quantifies false alarms

3 Traditional Monitoring
The traditional approach is Hotelling's (1947) T-squared chart
Numerical measurements, based on multivariate normality
Simple elliptical pattern (Mahalanobis distance)
Time-weighted extensions: exponentially weighted moving average (EWMA) and cumulative sum (CUSUM)
– More efficient, but same elliptical patterns
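For reference, a minimal sketch of a T² chart under these classical assumptions; the data, dimensions, and 99.5% chi-squared limit are illustrative, not from the talk:

```python
# Hotelling's T^2: squared Mahalanobis distance to the in-control mean,
# with a chi-squared control limit (illustrative, assumed setup).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X_ref = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=1000)

mu = X_ref.mean(axis=0)                              # estimated in-control mean
S_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))   # inverse covariance

def t2(x):
    d = x - mu
    return d @ S_inv @ d                             # squared Mahalanobis distance

ucl = stats.chi2.ppf(0.995, df=2)                    # elliptical control boundary
print("alarm" if t2(np.array([2.5, -2.5])) > ucl else "in control")
```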

4 Transform to Supervised Learning
Process monitoring can be transformed to a supervised learning problem
– One approach: supplement the actual data with artificial, contrasting data
– Any one of multiple learners can be used, without pre-specified faults
– Results can generalize monitoring in several directions, such as arbitrary (nonlinear) in-control conditions, fault knowledge, and categorical variables
– High-dimensional problems can be handled with an appropriate learner

5 Learn Process Patterns
Learn the pattern compared to a structureless alternative
Generate noise: artificial data without structure to differentiate
– For example, f(x) = f_1(x_1) f_2(x_2) \cdots f_p(x_p), the joint distribution as the product of marginals (enforce independence)
– Or f(x) = a product of uniforms
Define and assign y = ±1 to the actual and artificial data (the artificial contrast)
Use a supervised (classification) learner to distinguish the two data sets
– Only simple examples are used here (see the sketch below)
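A minimal sketch of the contrast construction; the random forest here is just one admissible learner (the talk's own examples use the RLS classifier described next), and permuting each column independently realizes the product-of-marginals alternative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X_actual = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=1000)

# Destroy structure: permute each column independently, so the artificial
# joint density is the product of the marginals.
X_artificial = np.column_stack(
    [rng.permutation(X_actual[:, j]) for j in range(X_actual.shape[1])]
)
# Alternative: independent uniforms over each variable's observed range.
# X_artificial = rng.uniform(X_actual.min(0), X_actual.max(0), X_actual.shape)

X = np.vstack([X_actual, X_artificial])
y = np.r_[np.ones(len(X_actual)), -np.ones(len(X_artificial))]  # y = +/-1
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
```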

6 Learn Pattern from Artificial Contrast

7 Regularized Least Squares (Kernel Ridge) Classifier with Radial Basis Functions
Model with a linear combination of basis functions
A smoothness penalty controls complexity
– Closely related to Support Vector Machines (SVMs)
– Regularized least squares allows a closed-form solution, but trades away sparsity; one may not want that trade!
Previous example: a challenge for a generalized learner, multivariate normal data!
[Surface plot of f(x) over x1, x2]

8 RLS Classifier
Model: f(x) = \sum_{i=1}^{n} c_i K(x_i, x), where K(x, x') = \exp(-\|x - x'\|^2 / (2\sigma^2)) is a radial basis function kernel, with parameters \lambda (regularization) and \sigma^2 (kernel width)
Solution: c = (K + \lambda n I)^{-1} y
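A direct implementation of this closed form; λ and σ² are assumed tuning parameters (the following slides use σ² = 5):

```python
import numpy as np

def rbf_kernel(A, B, sigma2):
    # K[i, j] = exp(-||A_i - B_j||^2 / (2 sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma2))

def rlsc_fit(X, y, lam=1e-3, sigma2=5.0):
    n = len(X)
    K = rbf_kernel(X, X, sigma2)
    return np.linalg.solve(K + lam * n * np.eye(n), y)  # c = (K + lam*n*I)^{-1} y

def rlsc_decision(X_train, c, X_new, sigma2=5.0):
    return rbf_kernel(X_new, X_train, sigma2) @ c       # sign(f) assigns +/-1
```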

9 Patterns Learned from Artificial Contrast RLSC
True Hotelling's 95% probability bound
Red: learned contour function to assign ±1
Actual: n = 1000; Artificial: n = 2000
Complexity: 4/3000
σ² = 5

10 More Challenging Example with Hotelling's Contour

11 Patterns Learned from Artificial Contrast RLSC
Actual: n = 1000; Artificial: n = 2000
Complexity: 4/3000
σ² = 5

12 Patterns Learned from Artificial Contrast RLSC
Actual: n = 1000; Artificial: n = 1000
Complexity: 4/2000
σ² = 5

13 RLSC for p = 10 Dimensions

            Training error   Testing error (Type II)   Chi-squared 99.5% (Type II)
Shift = 1
  Mean      0.00666          0.980                      0.982
  StDev     0.00057          0.00305                    –
Shift = 3
  Mean      0.005            0.487                      0.489
  StDev     0.00264          0.0483                     –

14 Tree-Based Ensembles, p = 10
Alternative learner:
– works with mixed data
– elegantly handles missing data
– scale invariant
– outlier resistant
– insensitive to extraneous predictors
Provides an implicit ability to select key variables (a sketch with OOB error follows the table)

            Training error (Type I)   OOB, training data   Testing error (Type II)   OOB, test data   Chi-squared 99.5% (Type II)
Shift = 1
  Mean      0                         0.00233              0.989                     0.0026           0.982
  StDev     0                         0.00152              0.0075                    0.0011           –
Shift = 3
  Mean      0                         0.00266              0.532                     0.0033           0.489
  StDev     0                         0.00115              0.2270                    0.0023           –
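A sketch of the tree-ensemble alternative with the out-of-bag (OOB) error as an internal error estimate; the p = 10 data setup below is assumed, not the talk's exact simulation:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
p = 10
X_actual = rng.standard_normal((1000, p))
X_artificial = rng.uniform(X_actual.min(0), X_actual.max(0), size=(1000, p))

X = np.vstack([X_actual, X_artificial])
y = np.r_[np.ones(1000), -np.ones(1000)]

rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                            random_state=0).fit(X, y)
print("OOB error estimate:", 1.0 - rf.oob_score_)
```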

15 Nonlinear Patterns
Hotelling's boundary is not a good solution when patterns are not linear
Control boundaries from supervised learning capture the normal operating condition

16 Tuned Control
Extend to incorporate specific process knowledge of faults
Artificial contrasts generated from the specified fault distribution
– or from a mixture of samples from different fault distributions
Numerical optimization to design a control statistic can be very complicated
– e.g., one that maximizes the likelihood function under a specified fault (alternative)

17 Tuned Control
Fault: the means of both variables x_1 and x_2 are known to increase
Artificial data (black) are sampled from 12 independent normal distributions
– Mean vectors are selected from a grid over the area [0, 3] × [0, 3]
The learned control region is shown in the right panel; it approximately matches the theoretical result in Testik et al. (2004). A sampling sketch follows.
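A sketch of tuned contrast generation; the grid of mean vectors here is hypothetical (the talk uses 12 means over [0, 3] × [0, 3], whose exact placement is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 4 x 3 grid of fault means over [0, 3] x [0, 3] (12 in total).
means = [(a, b) for a in (0.0, 1.0, 2.0, 3.0) for b in (0.0, 1.5, 3.0)]

# Artificial (y = -1) data: a mixture of samples from the fault distributions.
X_fault = np.vstack([
    rng.multivariate_normal(m, np.eye(2), size=100) for m in means
])
```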

18 Incorporate Time-Weighted Rules
What form of statistic should be filtered and monitored?
– The log likelihood ratio
Some learners provide class probability estimates. For equal sample sizes of actual (y = +1) and artificial (y = -1) data, Bayes' theorem gives
p(y = +1 \mid x) = f(x) / (f(x) + g(x)),
where f is the density of the actual data and g that of the artificial data. The log likelihood ratio for an observation x_t is then estimated as
l_t = \log \left[ \hat{p}(y = +1 \mid x_t) / (1 - \hat{p}(y = +1 \mid x_t)) \right]
Apply an EWMA (or CUSUM, etc.) to l_t; a drop in l_t indicates departure from the learned in-control pattern (see the sketch below)
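A minimal sketch, assuming a fitted scikit-learn classifier with predict_proba and an illustrative EWMA smoothing constant:

```python
import numpy as np

def llr(clf, X_stream):
    # Estimated log likelihood ratio l_t = log(p / (1 - p)), where p is the
    # learner's probability that x_t belongs to the actual (+1) class.
    idx = list(clf.classes_).index(1)
    p = np.clip(clf.predict_proba(X_stream)[:, idx], 1e-6, 1 - 1e-6)
    return np.log(p / (1.0 - p))

def ewma(l, weight=0.1):
    # z_t = weight * l_t + (1 - weight) * z_{t-1}; signal when z_t drops
    # below a control limit chosen for the desired in-control ARL.
    z = np.empty_like(l)
    z[0] = l[0]
    for t in range(1, len(l)):
        z[t] = weight * l[t] + (1.0 - weight) * z[t - 1]
    return z
```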

19 Time-Weighted ARLs
Average run lengths (ARLs) for selected schemes applied to the l_t statistic
– 10-dimensional, independent normal data

20 Example: 50 Dimensions

21 Example: 50 Dimensions
Hotelling's: left; artificial contrast: right

22 Example: Credit Data (UCI)
20 attributes: 7 numerical and 13 categorical
Associated class label of good or bad credit risk
Artificial data generated from continuous and discrete uniform distributions for the numerical and categorical attributes, respectively, independently for each attribute (see the sketch below)
Stream ordered as 300 good instances followed by 300 bad
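A sketch of this mixed-data contrast generation; the column handling below (continuous uniform for numerical attributes, discrete uniform over observed levels for categorical ones) is an assumed implementation:

```python
import numpy as np
import pandas as pd

def artificial_contrast(df: pd.DataFrame, n: int, seed: int = 0) -> pd.DataFrame:
    rng = np.random.default_rng(seed)
    cols = {}
    for name, col in df.items():
        if pd.api.types.is_numeric_dtype(col):
            # Continuous uniform over the attribute's observed range.
            cols[name] = rng.uniform(col.min(), col.max(), n)
        else:
            # Discrete uniform over the attribute's observed levels.
            cols[name] = rng.choice(col.dropna().unique(), n)
    return pd.DataFrame(cols)
```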

23 Artificial Contrasts for Credit Data
Plot of l_t over time

24 Diagnostics: Contribution Plots
50 dimensions: 2 contributors, 48 noise variables
(scatter plot projections onto the contributor variables)

25 Contributor Plots from PCA T²

26 Contributor Plots from PCA SPE (squared prediction error)

27 Contributor Plots from Artificial Contrast Ensemble (ACE)
Impurity importance weighted by means of the split variable
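The exact ACE weighting is not reproduced here; as a rough stand-in, impurity-based importances from a forest trained to separate current data from a reference sample rank the contributors:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
p = 50
X_ref = rng.standard_normal((1000, p))       # reference (in-control) sample
X_cur = rng.standard_normal((1000, p))
X_cur[:, :2] += 1.0                          # two contributors, 48 noise variables

X = np.vstack([X_ref, X_cur])
y = np.r_[np.ones(1000), -np.ones(1000)]

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("top contributors:", top)              # variables 0 and 1 should rank first
```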

28 Contributor Plots for Nonlinear System
Contributor plots from SPE, T², and ACE (left, center, and right, respectively)

29 Conclusions
Can and must leverage the automated, ubiquitous data and computational environment
– or face professional obsolescence
Employ a flexible, powerful control solution for broad applications: environment, health, security, etc., as well as manufacturing
– Normal conditions not obvious, patterns not known
Include automated diagnosis
– Tools to filter and identify contributors
Computational feasibility in embedded software
This material is based upon work supported by the National Science Foundation under Grant No. 0355575.

