Modeling of Core Protection Calculator System Software
February 28, 2005
Kim, Sung Ho



1  TABLE OF CONTENTS
- Introduction
- Classification Modeling
- Dependency Modeling
- Application of Modeling to CPCS
- Further Works
- References

2  Introduction
- Current CPCS software
  - performs the calculation of DNBR and LPD for reactor trip generation
  - uses process variables and operator-addressable constants for the calculation
  - does not have any formal method to express the system functions and software quality
  - requires formal expressions of the system functions and a quantitative software reliability measure that can be used to improve the software quality

[Block diagram: process inputs (CEA position, excore neutron flux signal, hot leg temperature, cold leg temperature, RCP shaft speed, pressurizer pressure) and coefficients of equations feed the Core Protection Calculator System software, which produces the DNBR trip, LPD trip, and CWP outputs.]

3  Introduction
- Modeling of the CPCS algorithm
  - is recommended for a formal expression of the system functions
  - can be used to evaluate the relationships between variables
  - can be used to assess software test coverage, which suggests the software reliability
- Bayesian way of system modeling
  - Formalization of the software: the relationship between the process input variables and the output values of the CPCS can be modeled using the Bayesian concept of conditional probability
  - Test case confirmation: the variances of input values and the corresponding variances of output values can be used to check the coverage of test cases for the system software
- Modeling methods considered for trial: classification modeling, dependency modeling, etc.

4  Classification Modeling
- What is classification modeling
  - Divide the variables of a system into one class variable and the rest as predictor variables
  - Find a model such that, given the values of the predictor variables, the value of the class variable can be inferred
  - Use the model to predict the value of the class variable from the values of the predictor variables
- Bayesian theory gives us tools to merge many quantitative models into one classification model by adopting the concept of probabilities of the parameters
  - This amounts to searching over a certain kind of Bayesian network structures
- The Naive Bayes model is considered well suited for classification modeling because of its special form, which connects all predictors to the class variable: every predictor depends on the class variable, and the predictors are independent of each other once the value of the class variable is known
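The Naive Bayes structure described above can be sketched directly from counts: a prior over the class variable, and per-predictor conditional distributions given the class. This is a minimal illustrative implementation (with add-one smoothing, an assumption not stated in the slides), not the tool actually used in this work.

```python
from collections import Counter, defaultdict

class NaiveBayes:
    """Minimal Naive Bayes: predictors are treated as conditionally
    independent of each other given the class variable."""

    def fit(self, X, y):
        self.n = len(y)
        self.class_counts = Counter(y)          # counts for the prior P(C=c)
        self.values = defaultdict(set)          # distinct values per predictor
        self.cond = defaultdict(Counter)        # (class, value) counts per predictor
        for row, c in zip(X, y):
            for j, v in enumerate(row):
                self.values[j].add(v)
                self.cond[j][(c, v)] += 1
        return self

    def predict(self, row):
        scores = {}
        for c, nc in self.class_counts.items():
            p = nc / self.n                     # prior P(C=c)
            for j, v in enumerate(row):
                # P(X_j=v | C=c) with add-one (Laplace) smoothing
                p *= (self.cond[j][(c, v)] + 1) / (nc + len(self.values[j]))
            scores[c] = p
        return max(scores, key=scores.get)
```

For example, fitting on a few (chest pain, angina) vectors labeled Presence/Absence and calling `predict` on a new vector returns the class with the highest posterior score.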

5  Example of Classification Modeling: Detecting Heart Disease
- Class variable and predictor variables

  Variable Name                                      | Data Type
  ---------------------------------------------------|--------------------------------------
  Age                                                | Numerical
  Sex                                                | Male/Female
  Chest pain type                                    | 1, 2, 3, 4
  Resting blood pressure                             | Numerical
  Serum cholestoral                                  | Numerical
  Fasting blood sugar                                | Yes/No
  Resting electrocardiographic results               | 0, 1, 2, 3
  Max. heart rate achieved                           | Numerical
  Exercise induced angina                            | Yes/No
  ST depression induced by exercise relative to rest | Numerical
  Slope of the peak exercise ST segment              | Numerical
  No. of major vessels colored by fluoroscopy        | 0, 1, 2, 3
  Thal                                               | Normal/Fixed defect/Reversible defect
  Heart disease (class variable)                     | Absence/Presence

  Heart disease is the class variable; all other variables are predictors.

6  Best Classification Model
- The classification model showing the best possible predictive accuracy, i.e., the model that most accurately classifies unclassified data vectors
- Training classifiers is building the best classifiers
- Search procedure for the best classification model:
  - Pick one model at random and declare it the best model so far
  - Pick another model at random and compare its predictive accuracy to that of the best model so far
  - Keep the more accurate model as the best model and continue
- Estimation of the predictive accuracy of a classifier (leave-one-out cross validation):
  - Remove one data vector from the N-vector data matrix and train the classifier with the remaining N-1 data vectors
  - Present the removed data vector to the classifier to predict the value of its class
  - If the classifier produces the correct class, it scores one point
  - Return the removed data vector to the data matrix and repeat the procedure, removing each vector in the matrix in turn
  - Sum the points and divide by the number of data vectors to get the average predictive accuracy (in %)
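The leave-one-out procedure above can be sketched in a few lines. The `MajorityClassifier` here is only a toy stand-in to make the sketch runnable; any classifier with `fit`/`predict` methods (such as a Naive Bayes model) could be plugged in.

```python
from collections import Counter

class MajorityClassifier:
    """Toy classifier (predicts the most common training label) used only
    to exercise the cross-validation procedure below."""

    def fit(self, X, y):
        self.label = Counter(y).most_common(1)[0][0]
        return self

    def predict(self, row):
        return self.label

def leave_one_out_accuracy(X, y, make_classifier):
    """Leave-one-out cross validation as described above: train on N-1
    vectors, classify the held-out vector, score one point per correct
    classification, and return the average accuracy in percent."""
    points = 0
    for i in range(len(X)):
        X_train, y_train = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        clf = make_classifier().fit(X_train, y_train)
        if clf.predict(X[i]) == y[i]:
            points += 1
    return 100.0 * points / len(X)
```

With four vectors of which three share a label, the toy classifier scores 3 out of 4 held-out vectors, i.e., 75 % average predictive accuracy.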

7  Result of Classification Modeling (1)
- Evaluated models for the sample case
- The last models evaluated did not result in finding better models
- The classification accuracy is %
- The variables Age, Sex, Cholestoral, Sugar, Max, and Angina have been left out from the model

  Importance of the variables:

  Arc (from Class) | Reduction of Accuracy
  -----------------|----------------------
  #Vessels         | 7.41 %
  Thal             | 6.67 %
  Old Peak         | 6.30 %
  Pain             | 5.56 %
  Pressure         | 1.85 %
  Rest             | 1.48 %
  Slope Peak       | 1.11 %

8  Result of Classification Modeling (2) – probabilistic classifier

9  Dependency Modeling
- What is dependency modeling
  - Finding a model of the probabilistic dependencies of the variables
  - The dependencies can be used to speculate about the causalities that might cause them
  - A good dependency model is one with high probability
  - Each dependency model is a set of statements about the dependencies between sets of variables, e.g.:
    - A and B are dependent on each other if something about C or D is known
    - A and C are dependent on each other no matter what is known about B or D
    - ...

10  Dependency Modeling using Bayesian Belief Networks
- Expressing the causalities between variables
- Setting up decision-making rules based on a formal method
- Updating the probabilities based on new information
- Intelligent components to deduce non-deterministic conditions in a complex system
- A dependency model can be represented in a simple graphical form using a Bayesian network; a model may have exactly one Bayesian network representation, or many slightly different Bayesian network representations
- In some cases, however, no Bayesian network representation exists

[Figure: example Bayesian network over four nodes A, B, C, D]

11  Predicting with Dependency Models (1)
- A dependency model can be used to calculate conditional probabilities of unknown future observations
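As a concrete illustration of such a prediction, consider a small network A → B, A → C, {B, C} → D like the four-node example. The joint distribution factorizes along the network structure, and a conditional probability of a future observation is obtained by marginalizing the unobserved variables. All probability values below are illustrative assumptions, not CPCS data.

```python
P_A_TRUE = 0.3                                    # P(A = true), assumed

def p_b(a):    return 0.8 if a else 0.1           # P(B = true | A = a)
def p_c(a):    return 0.5 if a else 0.2           # P(C = true | A = a)
def p_d(b, c): return 0.9 if (b and c) else 0.1   # P(D = true | B = b, C = c)

def joint(a, b, c, d):
    """Joint probability factorized along the network:
    P(A, B, C, D) = P(A) * P(B|A) * P(C|A) * P(D|B, C)."""
    pa = P_A_TRUE if a else 1 - P_A_TRUE
    pb = p_b(a) if b else 1 - p_b(a)
    pc = p_c(a) if c else 1 - p_c(a)
    pd = p_d(b, c) if d else 1 - p_d(b, c)
    return pa * pb * pc * pd

BOOLS = (True, False)

# P(D = true | A = true): marginalize out the unobserved B and C
num = sum(joint(True, b, c, True) for b in BOOLS for c in BOOLS)
den = sum(joint(True, b, c, d) for b in BOOLS for c in BOOLS for d in BOOLS)
p_d_given_a = num / den
```

With the assumed numbers this gives P(D = true | A = true) = 0.42, a prediction about D before D has been observed.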

12  Predicting with Dependency Models (2)

13  Application of Modeling to CPCS
- Classification modeling of the CPCS algorithm
  - Divide the input variables into DNBR/LPD/CWP classes
- Dependency modeling of the CPCS algorithm
  - To know whether and how the variables in the CPCS algorithm depend on each other
  - To know the conditions for the dependencies between variables (e.g., some knowledge of other variables)
  - To assess the software designer's certainty about the true existence and nature of the dependencies between variables using probability theory
- CPCS model to be used for the software test
  - Model the system using the input/output variables and, if needed, the internal constants, to find the dependencies between input variables, output variables, and constants
  - Select one input variable
  - Change the values of the selected variable and monitor the change of the values of the constants and outputs
  - Confirm that the output values can be predicted when the input values are known
  - Confirm that all possible ranges of input process values and constants can be generated to cover all possible conditions of the software
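The test procedure above (select one input, sweep it over its range, monitor the outputs) can be sketched as a simple loop. The function `cpcs_model`, the variable names, the ranges, and every coefficient here are hypothetical placeholders for illustration only; they do not represent the real CPCS calculation or its setpoints.

```python
def cpcs_model(t_hot, t_cold, pzr_pressure):
    """Hypothetical placeholder for the DNBR calculation; the real CPCS
    software would be exercised here instead."""
    return 3.0 - 0.002 * (t_hot - t_cold) - 0.0001 * (2250 - pzr_pressure)

def sweep_input(values, fixed):
    """Vary the selected input variable over `values` while holding the
    other inputs fixed, and record each (input, output) pair."""
    return [(v, cpcs_model(t_hot=v, **fixed)) for v in values]

# Sweep the hot leg temperature with the other inputs held fixed
results = sweep_input(range(580, 621, 10),
                      fixed={"t_cold": 550, "pzr_pressure": 2250})

# Flag input values whose output falls below a (hypothetical) trip threshold
near_trip = [v for v, dnbr in results if dnbr < 2.9]
```

Recording such sweeps for every input variable is one way to check whether the outputs are predictable from the inputs and whether the generated cases cover the input ranges.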

14  Core Protection Calculator System Algorithm

[Block diagram: process input signals (CEA position, excore neutron flux signal, hot leg temperature, cold leg temperature, RCP shaft speed, pressurizer pressure) and operator-addressable constants (coefficients of equations) feed the Core Protection Calculator System software, which produces the system outputs: DNBR trip, LPD trip, and CWP.]

15  Modeling of CPCS Algorithm
- Considered to have combined converging and diverging connections, which require:
  - partition of the variables for each class variable after classification modeling
  - manipulation of the variables common to several class variables

[Figure: network with inputs CEA position, T-Hot, T-Cold, and P-PZR connected to the DNBR trip, LPD trip, and CWP outputs]

16  Modeling of CPCS Algorithm
- Considered to be modeled using divorcing to reduce the number of cases in the state distributions

[Figure: the same network with an intermediate (divorcing) node M1 inserted between the inputs and the outputs]

17  Data Vector for CPCS Modeling

18  Further Works
- Data vectors for the CPCS variables should be generated
  - Test cases should be generated
  - Test results should be generated using the test cases
- Dependency modeling of the CPCS using the data vectors should be performed
  - The dependencies of the input variables can be shown for the output values of DNBR, LPD, and CWP
  - Formal expressions for the CPCS algorithm and software quality should be generated
- Test coverage should be checked using the model

19  References
- Finn V. Jensen, "An Introduction to Bayesian Networks", 1996
- Unit Test Cases for CPCS for SKN 1&2
