Robin McDougall, Ed Waller and Scott Nokleby
Faculties of Engineering & Applied Science and Energy Systems & Nuclear Science

Presentation transcript:


Overview
- Motivation
  - What is a radiation map?
  - Potential problems in their generation
- Overview of the Proposed Strategy
  - Core components
  - Their functional relationship
- Illustrative Case Study
  - Present a preliminary simulation study ("sanity check")

Motivation
- Radiation maps characterize a radiation field in an easily understood format.
- They can help minimize total exposure by identifying areas with relatively high radiation intensities.
- There are a number of options for generating them (e.g., workflow optimization).
- In some instances it becomes necessary to use modelling-based techniques to predict the radiation intensities in areas where a sensor cannot be positioned.

Method - Overview

We propose using a five-stage procedure for generating radiation maps based on sparse or incomplete radiation sensor data:

1. Select an Appropriate Radiation Model for the environment
   - Representative of the physical environment
   - Two (at least!) concerns: the radiation intensity field and the radiation sensor
   - Can support multiple radiation modelling tools: simplistic (inverse square law) or more complex (MCNP, MicroShield)
   - Parameterized appropriately: [Radiation Intensities] = f(Source Locations, Source Intensities)

2. Gather Radiation Data
   - Use a suitable scheme: static sensor net, human operators, or mobile robots
   - Take readings and methodically transfer them to physical layout maps
   - Record multiple readings for each location (sensor variability)

3. Calibrate Radiation Model
   - Use the data collected in (2) to calibrate the "generic" radiation model to the specific instantaneous radiation exposure scenario
   - The choice of calibration method should consider: the order of the model (linear? non-linear?); the variability, uncertainty, and sparsity of the data; and the type of map being generated
   - Regardless of technique, we want to infer the intensity and location(s) of the sources most likely to have caused the observed data

4. Generate Data for Map
   - Use the "calibrated" model from (3) to calculate predicted values at intervals fine enough to characterize the radiation intensity field in the area

5. Data Visualization (Create Map)
   - Overhead views, isobars, or augmented reality

An Example…

Consider a 20 x 20 area:
- Exposed to two radiation sources: one placed at P1 (2, 5) with an intensity of I = 650, and another at P2 (16, 6) with an intensity of I = 350.
- Sensor data are taken from two edges.

Objective: generate a radiation map for the entire region using radiation modelling.

Example – Radiation Model

1. Select Radiation Model
- Choose a radiation model based on the environment.
- Model parameters in: source position(s), source intensities.
- Data out: sensor readings for sample locations, radiation intensities for grid locations.

Example – Radiation Model

1. Select Radiation Model
- Two elements: we want to model both the radiation intensity and the sensor readings.
- In the preliminary study we used the simplistic inverse-square modelling method.
- The radiation intensity at a point P was found by summing the contributions from the two sources, each contribution being equal to the intensity at the source divided by the square of the distance to it.
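The slide describes the inverse-square model in words only; a minimal Python sketch of that calculation is below. The function and variable names (`intensity_at`, `sources`) are illustrative, not from the paper.

```python
import math

def intensity_at(point, sources):
    """Inverse-square model: total intensity at `point` is the sum of each
    source's intensity divided by the squared distance to that source."""
    x, y = point
    total = 0.0
    for (sx, sy, strength) in sources:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        total += strength / d2
    return total

# The two sources from the example: P1 = (2, 5) with I = 650,
# P2 = (16, 6) with I = 350.
sources = [(2, 5, 650.0), (16, 6, 350.0)]
print(intensity_at((10, 10), sources))
```

Points closer to a source see a larger contribution from it, which is what lets the calibration step infer source locations from edge readings.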

Example – Radiation Model

1. Select Radiation Model
- The sensor was modeled by sampling a Poisson distribution, using the intensity P from the radiation model as the mean:

  Radiation at sensor S = Pois(P_S)

Example – Element Details

2. Gather Radiation Data
- In the real world, this is where the sensor readings would be acquired (sensor net, operator, or robot).
- In this simulated study, the radiation model from (1) was used to synthesize (simulate) radiation readings for each point (5 for each of the 13 points).

Example – Element Details

3. Calibrate Radiation Model
- Use an optimization-based routine to infer where the sources are, using the model from (1) and the data from (2), by maximizing a likelihood function.
- This is an iterative process: candidate source locations and intensities are proposed, the values of their modeled sensor readings (1) are compared with the observed data (2), and the candidates are updated until some termination criterion is reached.
- Many quantitative techniques can perform this task; the choice depends on the system being studied. Unique characteristics of this scenario:
  - Potential for very non-linear radiation models
  - Radiation sensing is a discrete probabilistic process (the more samples taken, the closer the mean approaches the actual intensity)
  - The sparse available data imparts uncertainty as well
- It is desirable to capture and characterize these effects in our estimates for the parameters that describe the source(s), so we propose using Bayesian inference techniques to sample the likelihood function.
- Two additional design considerations: which likelihood function to use, and how to explore candidate solutions.
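The calibration step compares modeled counts against observed counts; under the Poisson sensor model, a natural likelihood function is the Poisson log-likelihood. A minimal sketch is below, assuming the two-source inverse-square model from the example; the parameter packing and the name `log_likelihood` are my own, not the paper's.

```python
import math

def log_likelihood(params, observations):
    """Poisson log-likelihood of observed counts for a candidate source
    configuration. `params` = (x1, y1, I1, x2, y2, I2); `observations`
    is a list of ((x, y), count) pairs."""
    x1, y1, i1, x2, y2, i2 = params
    ll = 0.0
    for (px, py), k in observations:
        # Modeled mean at the sensor location (inverse-square, two sources).
        mu = i1 / ((px - x1) ** 2 + (py - y1) ** 2) \
           + i2 / ((px - x2) ** 2 + (py - y2) ** 2)
        # log Poisson pmf: k*log(mu) - mu - log(k!)
        ll += k * math.log(mu) - mu - math.lgamma(k + 1)
    return ll
```

Candidate configurations that predict means close to the observed counts score higher, which is what the iterative search (and, later, the MCMC sampler) exploits.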

Bayesian Inference Background
- Instead of point estimates for locations and intensities, we want to find distributions of the parameters as implied by a likelihood function.
- We also want to incorporate prior knowledge. The simplest case is upper/lower bounds on the parameters, but priors could be distributions as well.
- The idea is to use Bayes' theorem to find the joint posterior distribution, then draw samples from this distribution.
- Once we have the collection of samples, we can get any other kind of statistical information we need:
  - Marginal densities of the parameters
  - Correlations
  - Means, modes, etc.

Bayes Theorem (May 30, 2008)
- Prior distribution: prior knowledge regarding the distribution of the parameters.
- Likelihood function: the probability of observing a set of model outputs (x) given a set of parameters (theta).
- Posterior distribution: the distribution of the parameters taking into account the likelihood and the prior information.
- Bayes' theorem relates these quantities as follows:

  p(theta | x) = p(x | theta) p(theta) / p(x)

  i.e., the posterior is proportional to the likelihood times the prior.

The Posterior Distribution
- Contains everything we want to know about the distributions of the parameters.
- We get information about the parameter distributions by sampling from the posterior, then performing various analyses on the collections of samples (marginals, correlations, etc.).
- It is usually impossible to sample from directly: it is multivariate, has no analytic form, and evaluating it usually involves running the simulation.

How to Sample Posteriors
- Conventional techniques don't work. E.g., with rejection sampling, the posterior is close to zero in most places, so almost all samples would be rejected.
- Markov Chain Monte Carlo (MCMC) techniques provide a way to sample from the posterior: one sample is used to generate the next sample in a "smart" way, producing "chains" of samples.
- We propose using MCMC techniques based on the Gibbs and Metropolis-Hastings sampling algorithms.
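The slide names Metropolis-Hastings without detail; a minimal random-walk Metropolis-Hastings sketch is below (not the authors' implementation). It only needs the log-posterior up to a constant, which is why the intractable normalizer p(x) never has to be computed. The sanity check targets a standard normal rather than the radiation posterior.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_steps, step_size, rng):
    """Random-walk Metropolis-Hastings: propose a Gaussian perturbation of
    the current sample and accept it with probability
    min(1, exp(log_post(proposal) - log_post(current)))."""
    chain = [np.asarray(x0, dtype=float)]
    lp = log_post(chain[-1])
    for _ in range(n_steps):
        proposal = chain[-1] + rng.normal(0.0, step_size, size=len(chain[-1]))
        lp_new = log_post(proposal)
        if np.log(rng.uniform()) < lp_new - lp:   # accept the proposal
            chain.append(proposal)
            lp = lp_new
        else:                                     # reject: repeat current sample
            chain.append(chain[-1])
    return np.array(chain)

# Sanity check on a 1-D standard normal target (log density up to a constant).
rng = np.random.default_rng(0)
chain = metropolis_hastings(lambda x: -0.5 * float(x @ x), [0.0], 20000, 1.0, rng)
```

Repeating the current sample on rejection is what makes the chain's stationary distribution match the target; for the radiation problem, `log_post` would combine the Poisson likelihood with the priors, and the early "burn-in" portion of each chain would be discarded.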

MCMC Configuration

MCMC Results
- For this study, recall the system has two sources of radiation.
- Resulting chains from the MCMC study: 6 parameters / 6 chains.

Example – Element Details

4. Generate Data for Map
- Use a forward Monte Carlo (FMC) approach.
- For each FMC iteration, randomly select a value from the posterior distributions for each parameter (3).
- Run the model (1) with this candidate set and record the predicted intensities for a sufficient number of points.
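The forward Monte Carlo loop above can be sketched generically; here `model` and `posterior_samples` are placeholders for the calibrated radiation model and the MCMC output, and the function name is my own.

```python
import numpy as np

def forward_mc(posterior_samples, model, grid_points, n_iters, rng):
    """Forward Monte Carlo: each iteration draws one parameter vector from
    the posterior samples, runs the model, and records the predicted
    intensity at every grid point."""
    preds = np.empty((n_iters, len(grid_points)))
    for i in range(n_iters):
        theta = posterior_samples[rng.integers(len(posterior_samples))]
        preds[i] = [model(p, theta) for p in grid_points]
    return preds
```

Drawing whole parameter vectors from the chain (rather than sampling each parameter independently) preserves the correlations between source locations and intensities captured by the posterior.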

Example – Element Details

5. Visualization / Map Generation
- Process the FMC results appropriately and generate the map.
- For example: take the 90th-percentile radiation intensity for each grid intersection (400 points) and plot intensity isobars.
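The percentile step is a one-liner over the forward-Monte-Carlo output; in this sketch `preds` is stand-in Poisson data with the shape the FMC step would produce (iterations by grid points), not real study results.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the forward-Monte-Carlo output: 500 iterations over the
# 20 x 20 grid flattened to 400 points (hypothetical data).
preds = rng.poisson(10.0, size=(500, 400)).astype(float)

# One 90th-percentile intensity per grid point, reshaped to the grid
# for isobar (contour) plotting.
map_values = np.percentile(preds, 90, axis=0).reshape(20, 20)
```

Using an upper percentile rather than the mean gives a conservative map: it reflects the intensity a worker is unlikely to exceed given the posterior uncertainty.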

Final Thoughts…
- In the preliminary study presented here, the model used in the map generation tool was "perfect": the same model was used to synthesize the data in the study.
- The platform and procedure themselves are generic enough that extension to more sophisticated radiation models should be straightforward.

Acknowledgements:
- University Network of Excellence in Nuclear Engineering
- Natural Sciences and Engineering Research Council

Thank you for your time! Questions?