
1. One-out all-out principle or Bayesian integration?
Sakari Kuikka: University of Helsinki
Seppo Rekolainen: Finnish Environmental Institute
Mikko Mukula: University of Helsinki
Jouni Tammi: University of Helsinki
Laura Uusitalo: University of Helsinki

2. FEM research group at the University of Helsinki
1 professor, 3 postdoctoral researchers, 6 postgraduate researchers, 2 graduate students
2 locations: Helsinki and Kotka
Research interests:
– Decision analysis of renewable resources
– Integrating different sources of data and other knowledge: Bayesian analysis
– Identification and quantification of risks in the use of natural resources
– Analysis of the management of natural resources in the face of risks and uncertainty in the information and control
=> The user of information plays an essential role

3. Aim of data collection and data analysis: to increase the probability of correct decision making
Correct? = achieving the aim with high probability, or avoiding a problem with high probability (such as "points of no return")

4. Objectives of the talk
1) To briefly discuss the sources of uncertainty
2) To briefly present Bayes' theory
3) To present a classification model based on Bayes' rule
4) To compare the results to the "one-out all-out" principle

5. Number of elements and chance for misclassification
[Figure: EU CIS Ecostat Guidance 2003]

6. Risk: e.g. the probability to be, or to go, above a critical threshold?
=> Probabilistic calculus may be needed for a correct decision (a dioxin or P load "of no return")

7. Risk
Risk = probability * loss
Two alternative coin games:
A) 0.5 * 1000 euros and 0.5 * (-1000 euros), or
B) 0.5 * 10 000 euros and 0.5 * (-10 000 euros)
I would pay at least 500-2 000 euros to get the first game instead of the second.
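The point of the coin games can be sketched numerically: both games have the same expected monetary value, so a willingness to pay for game A over game B reflects risk aversion. A minimal sketch, where the log utility function and the baseline wealth are purely illustrative assumptions, not part of the talk:

```python
import math

def expected_value(game):
    """Expected monetary value of a list of (probability, payoff) pairs."""
    return sum(p * x for p, x in game)

def expected_utility(game, u):
    """Expected utility of the same game under a utility function u."""
    return sum(p * u(x) for p, x in game)

game_a = [(0.5, 1000.0), (0.5, -1000.0)]
game_b = [(0.5, 10000.0), (0.5, -10000.0)]

# Both games are fair: expected monetary value is zero...
print(expected_value(game_a), expected_value(game_b))

# ...but under a concave (risk-averse) utility, the smaller-stakes game wins.
def u(x, wealth=20000.0):
    # Log utility over total wealth; an illustrative assumption.
    return math.log(wealth + x)

print(expected_utility(game_a, u) > expected_utility(game_b, u))  # True
```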

8. Sources of uncertainty
Variability over: time, space, measurements; uncertainty in model selection
E.g. several visits to the same lake can produce different measurements/assessment values
E.g. a lake can naturally have poor benthos (e.g. due to high fish predation?)
=> causalities are not always deterministic

9. Uncertainties
So, there are uncertainties:
1) In measurements (mostly these here)
2) In the causal relationships of nature
It is difficult to separate these in a data analysis!

10. Bayes' rule
P(b|a) = P(a|b) P(b) / P(a)
a: data, observations, etc.
b: probability of a parameter value, or a hypothesis
Note: all argumentation is based on probability distributions, not on single values!

11. Likelihood
P(measurement | correct value)
E.g. if the correct value is 10, we may have:
Measurement  Probability
12           0.2
10           0.6
8            0.2
So, measurement 12 can be linked to several real values of the lake!
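The slide's last point can be made concrete by inverting the likelihood. A sketch under an added assumption (not stated on the slide) that the same symmetric error model holds for every true value t: P(meas = t) = 0.6, P(meas = t +/- 2) = 0.2. With a uniform prior over candidate true values:

```python
def measurement_likelihood(meas, true_value):
    """P(measurement | true value) under the assumed symmetric error model."""
    offsets = {0: 0.6, 2: 0.2, -2: 0.2}
    return offsets.get(meas - true_value, 0.0)

candidates = [8, 10, 12, 14]   # hypothetical true values, uniform prior
meas = 12
unnorm = {t: measurement_likelihood(meas, t) for t in candidates}
total = sum(unnorm.values())
post = {t: w / total for t, w in unnorm.items()}

# Measurement 12 is consistent with true values 10, 12 and 14.
print(post)
```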

12. Bayes' rule: probabilistic dependencies
[Diagram: real number of fish (B) linked to observations/data (A); the model gives P(A|B), inference gives P(B|A)]

13. Bayesian inference: P(N | data) ∝ P(data | N) P(N)

14. Discretization

15. Applying Bayes' rule
Several uncertain but mutually supporting information sources increase the total evidence (= decrease uncertainty)
In the WFD, the probability (posterior) of a certain classification result, obtained after the probabilistic assessment of the first quality element (e.g. fish), could be used as a prior for the analysis of the next element.
And it also should be = all quality elements have their own role => the learning process of science

16. Model structure
[Diagram: meta-model, with submodels (naive nets) under each element!]

17. Sub-models: naive Bayesian nets
[Diagram: Class node with children Sp_1, Sp_2, Sp_3, Sp_4, Sp_5]
Generally speaking, the best methodology to classify
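The diagrammed sub-model structure (a class node with conditionally independent species children) can be sketched as a naive Bayes computation. All likelihood numbers and species names below are illustrative assumptions, not data from the talk:

```python
def naive_bayes(prior_ok, lik_ok, lik_restore, observations):
    """P(class = OK | observations), assuming species are conditionally
    independent given the class (the 'naive' assumption)."""
    p_ok, p_restore = prior_ok, 1.0 - prior_ok
    for species, present in observations.items():
        l_ok = lik_ok[species] if present else 1.0 - lik_ok[species]
        l_re = lik_restore[species] if present else 1.0 - lik_restore[species]
        p_ok *= l_ok
        p_restore *= l_re
    return p_ok / (p_ok + p_restore)

# P(species present | class), hypothetical values:
lik_ok      = {"Sp_1": 0.8, "Sp_2": 0.7, "Sp_3": 0.2}
lik_restore = {"Sp_1": 0.3, "Sp_2": 0.4, "Sp_3": 0.6}
obs = {"Sp_1": True, "Sp_2": True, "Sp_3": False}
print(round(naive_bayes(0.5, lik_ok, lik_restore, obs), 3))
```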

18. Data in this analysis: input to the naive nets
Only one lake type
Fish stock data: 80 lakes, gillnet
Phytoplankton: 1330 samples
Benthos: 71 samples (22 lakes)
Macrophytes: 70 surveys (47 lakes)
"Truth" needed to test the method = an arbitrary value of phosphorus was selected as a classifier for lake class

19. Analysis of data
Classes: OK (high or good = < 30 ug TP/l), Restore (moderate or less = > 30 ug TP/l)
Probability of correct classification: leaving out one data point at a time from parameter estimation, and using the biological information of that data point to classify (the phosphorus of) that lake (Weka software)
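The leave-one-out procedure described above can be sketched as follows. Here `fit` and `classify` stand in for the naive Bayes submodels built in Weka; the toy mean-threshold model and lake data are purely illustrative assumptions:

```python
def leave_one_out_accuracy(lakes, fit, classify):
    """Remove each lake from training in turn, fit on the rest, and check
    whether the held-out lake is classified to its true class."""
    correct = 0
    for i, held_out in enumerate(lakes):
        training = lakes[:i] + lakes[i + 1:]
        model = fit(training)
        if classify(model, held_out) == held_out["truth"]:
            correct += 1
    return correct / len(lakes)

# Hypothetical toy data: one feature per lake, truth from the 30 ug TP/l rule.
lakes = [{"x": v, "truth": ("OK" if v < 30 else "Restore")}
         for v in (10, 20, 25, 35, 40, 50)]
fit = lambda train: sum(l["x"] for l in train) / len(train)  # mean threshold
classify = lambda m, lake: "OK" if lake["x"] < m else "Restore"
print(leave_one_out_accuracy(lakes, fit, classify))
```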

20. Model assumptions
"One out - all out": the total assessment is "restore" if one of the components goes to "restore"
The same model is used to test how Bayes' rule works in classification
Each element was analyzed with a separate, specific model (naive Bayes net). This "meta-model" uses likelihoods estimated by those (also integrating) submodels

21. Results 1: Likelihoods (probabilities of correct/incorrect classifications)
Truth     Assessm.  Fish  Macroph.  Benthos  Phytopl.
OK        OK        0.92  0.93      0.75     0.91
OK        Restore   0.08  0.07      0.25     0.09
Restore   Restore   0.77  0.69      0.65     0.79
Restore   OK        0.23  0.31      0.35     0.21
Estimated by the naive submodels for each element
The results of the last line are problematic!

22. Results 2: one-out all-out
Applying one-out, all-out:
If the lake is restore, P(assessm = restore) = 0.99
If the lake is OK, P(assessm = restore) = 0.37! (or even higher, depending on some details)
= Potential for misclassification, i.e. a lot of mismanagement!
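Under one-out all-out, the overall assessment is "restore" unless every element says "OK". A sketch of that arithmetic from the "Results 1" likelihood table, assuming the four elements are independent given the true class; the table values give roughly 0.42 for the false-alarm case, in the same ballpark as the slide's 0.37 (which, as noted, depends on some details):

```python
# P(assessment = OK | true class), per element: fish, macroph., benthos, phytopl.
p_assess_ok_given_ok      = [0.92, 0.93, 0.75, 0.91]
p_assess_ok_given_restore = [0.23, 0.31, 0.35, 0.21]

def prob_restore(p_each_ok):
    """P(at least one element says 'restore') = 1 - P(all elements say 'OK')."""
    p_all_ok = 1.0
    for p in p_each_ok:
        p_all_ok *= p
    return 1.0 - p_all_ok

print(round(prob_restore(p_assess_ok_given_restore), 2))  # restore lakes are caught
print(round(prob_restore(p_assess_ok_given_ok), 2))       # false-alarm rate for OK lakes
```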

23. Results 3: Bayes rule / 1
Applying Bayes' rule for a single observation & naive net assessment (starting from prior = 0.5):
obs: macr = OK; P(lake = OK) = 0.68
obs: fish = OK; P(lake = OK) = 0.80
Bayes' rule for 2 joined observations:
obs: macr = OK, fish = OK; P(lake = OK) = 0.89
obs: benth = OK, phyt = OK; P(lake = OK) = 0.87
obs: macr = resto, fish = resto; P(lake = resto) = 0.99
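The sequential updating behind these numbers can be sketched as follows: the posterior after one quality element becomes the prior for the next. The likelihoods are taken from the "Results 1" table; the slide's exact figures come from the full naive-net submodels, so this table-level sketch reproduces them only approximately (fish = OK does give 0.80 here):

```python
# P(element assessment = OK | true lake class), from the "Results 1" table.
LIK = {
    "fish":    {"OK": 0.92, "restore": 0.23},
    "macroph": {"OK": 0.93, "restore": 0.31},
    "benthos": {"OK": 0.75, "restore": 0.35},
    "phytopl": {"OK": 0.91, "restore": 0.21},
}

def update(prior_ok, element, assessment):
    """One Bayes-rule step: posterior P(lake = OK) after one element's assessment."""
    l_ok = LIK[element]["OK"]
    l_re = LIK[element]["restore"]
    if assessment == "restore":          # use the complement likelihoods
        l_ok, l_re = 1.0 - l_ok, 1.0 - l_re
    return l_ok * prior_ok / (l_ok * prior_ok + l_re * (1.0 - prior_ok))

p = 0.5                                  # flat prior, as on the slide
for element in ("fish", "macroph"):      # two joined "OK" observations
    p = update(p, element, "OK")
print(round(p, 2))
```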

24. Conclusions I: "One out all out"
The problem of the "one out all out" principle lies in the relatively high uncertainty between the real state of nature and the assessment result, i.e. in the likelihood functions (especially benthos in this data set)
The more uncertain elements there are, the more likely a "false alarm" becomes

25. Conclusions II: Bayes model
Bayes' rule helps to integrate uncertain evidence from several sources
The assessment result "restore" is likely to be correct with a Bayesian model
The assessment result "OK" is more uncertain, as it may mean a "restore" lake (see the likelihood relationships)
Bayesian models are an easier and cheaper way to decrease uncertainty than increased monitoring effort

26. Results 1 (repeated): Likelihoods (probabilities of correct/incorrect classifications)
Truth     Assessm.  Fish  Macroph.  Benthos  Phytopl.
OK        OK        0.92  0.93      0.75     0.91
OK        Restore   0.08  0.07      0.25     0.09
Restore   Restore   0.77  0.69      0.65     0.79
Restore   OK        0.23  0.31      0.35     0.21
Estimated by the naive submodels for each element
The results of the last line are problematic!

27. Conclusions III: Management
There is clearly a need to link management decisions (programme of measures) to the classification: they would give content to the uncertainty in classification (= probability of misallocation of money?)
We suggest that the probability of misclassification is a policy issue, not a scientific issue
Classification models may have an impact on the interest to collect/improve data?

28. Way forward I: Risk assessment and risk management
[Diagram: pressure vs. CHL or "P level of no return", with three threshold levels A, B, C]
A = point estimate level
B = risk-averse attitude in the threshold only
C = implementation uncertainty included

29. Conclusions IV
Risk assessment and risk management must be separated (cf. the Scientific, Technical and Economic Committee for Fisheries)
Framework directive = should risk attitude be country specific? On which values of society must it be based?
Does the number of people per lake have an impact on management conclusions? (public participation = a mechanism to bring in values)

30. Conclusions V
Bayesian network methodology is easy: one week of education to start with your data
The conceptual part is more difficult, but far easier than understanding the real information content of test statistics in "classical statistics"
Bayesian parameter estimation (in some areas "the most correct way to do it") with e.g. WinBUGS software is more difficult, but achievable in 6-8 months of work
Education!!!! = Marie Curie activities, join with fisheries?

31. Way forward II: Multiobjective valuation
[Value-tree diagram: goal "improved lake"; objectives (weights 0-1): ecological status, fishing, recreational interests; criteria: fish, macrophytes (kg/ha, CPUE), swimming, boating; alternatives: Lake 1, Lake 2, Lake 3]
An example of the value tree
Anne-Marie Hagman, Mika Marttunen, SYKE

32. Way forward II: Example of ranking
By: Anne-Marie Hagman, Mika Marttunen, SYKE

