Introduction to MCMC and BUGS

Computational problems
- More parameters -> even more parameter combinations
- Exact computation and grid approximation become infeasible
- The posterior distribution can be approximated faster by simulation techniques
- Simulation: drawing random numbers from a distribution (a short R sketch follows)
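To make the last point concrete, here is a minimal simulation sketch in R (R is the interface language discussed later in this presentation; the Beta posterior is an illustrative assumption, not part of the original slides):

    ## Posterior for a proportion after 7 successes in 10 trials with a
    ## uniform prior is Beta(8, 4); exact answers exist here, so the
    ## simulation can be checked against them.
    draws <- rbeta(10000, 8, 4)        # drawing random numbers from the posterior
    mean(draws)                        # approximates the exact mean 8/12 = 0.667
    quantile(draws, c(0.025, 0.975))   # approximate 95% credible interval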

Simulation

Markov chain Monte Carlo
- Draw random numbers from the posterior distribution
- Each number depends on the previous one (see the sketch below)
- Start from an arbitrary value
- The simulation "finds" the posterior distribution and provides random numbers from it
- Advantage: very complex models can be analysed
- Disadvantage: the length of the initial "searching" (burn-in) phase is difficult to identify
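A minimal random-walk Metropolis sampler in R (an illustrative sketch, not from the slides; a standard normal target stands in for a real posterior). It shows each point above: every draw depends on the previous one, the chain starts from an arbitrary value, and the early searching phase is visible in the trace:

    set.seed(1)
    target <- function(x) dnorm(x)             # toy "posterior" density

    n <- 5000
    x <- numeric(n)
    x[1] <- 10                                 # arbitrary starting value, far from the target

    for (i in 2:n) {
      proposal <- x[i - 1] + rnorm(1)          # candidate depends on the previous value
      # accept with probability min(1, target(proposal) / target(x[i - 1]))
      if (runif(1) < target(proposal) / target(x[i - 1])) {
        x[i] <- proposal
      } else {
        x[i] <- x[i - 1]                       # rejected: the chain stays put
      }
    }

    plot(x, type = "l")                        # the burn-in "searching" phase shows at the start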

Remarks
- MCMC is only one way to approximate the posterior distribution; it is not a modelling approach in itself!
- There are other methods as well: importance sampling (sketched below), sampling-importance-resampling (SIR), sequential Monte Carlo (particle filters), etc.
- MCMC is usually easier to set up
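For contrast with MCMC, a minimal importance-sampling sketch in R (an illustrative example, not from the slides): independent draws from a proposal distribution are reweighted, instead of forming a dependent chain.

    set.seed(1)
    target   <- function(x) dnorm(x)       # unnormalised toy "posterior"
    proposal <- function(x) dt(x, df = 3)  # heavier-tailed proposal density

    z <- rt(10000, df = 3)                 # independent draws from the proposal
    w <- target(z) / proposal(z)           # importance weights

    sum(w * z) / sum(w)                    # self-normalised estimate of the posterior mean (about 0)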

BUGS
Bayesian Inference Using Gibbs Sampling. Gibbs sampling is one form of MCMC (a minimal sketch of the idea follows the list).
1. User specifies the model structure
2. User inputs the data
3. Software uses Bayes’ theorem and constructs an MCMC algorithm to sample from the posterior
4. User assesses the convergence of the MCMC
5. User uses the software to calculate descriptive statistics and produce figures
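Gibbs sampling draws each parameter in turn from its full conditional distribution given the current values of the others. A minimal R sketch for a toy bivariate normal target with correlation 0.8 (an illustration of the idea, not of how BUGS is actually implemented):

    set.seed(1)
    rho <- 0.8
    n   <- 5000
    x <- y <- numeric(n)
    x[1] <- y[1] <- 5                      # arbitrary starting values

    for (i in 2:n) {
      # full conditionals of a standard bivariate normal with correlation rho
      x[i] <- rnorm(1, mean = rho * y[i - 1], sd = sqrt(1 - rho^2))
      y[i] <- rnorm(1, mean = rho * x[i],     sd = sqrt(1 - rho^2))
    }

    cor(x, y)                              # close to 0.8 once the chain has settled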

Using BUGS
- Lots of pointing and clicking; some time is needed to learn the procedure
- A scripting language is also available: routine points and clicks can be automated, which is useful once you know how to point and click
- R interface: BUGS can be used from R through the same scripting language, and the results are exported to R for further analysis (see the sketch below)
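A hedged sketch of the R interface using the R2WinBUGS package; the file name, the installation path, and the toy model are placeholders, and the details (e.g. how data and initial values are passed) vary by model:

    library(R2WinBUGS)                     # install.packages("R2WinBUGS") first

    # "model.txt" is assumed to contain the BUGS model from the next slide
    fit <- bugs(data = list(),             # this toy model needs no data
                inits = NULL,              # let WinBUGS generate initial values
                parameters.to.save = "x",
                model.file = "model.txt",
                n.chains = 1, n.iter = 2000,
                bugs.directory = "c:/Program Files/WinBUGS14")  # placeholder path

    print(fit)                             # posterior summaries, back in R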

How to point and click
Follow these steps to specify a simple model and simulate the posterior distribution:
1. Start BUGS: winbugs.exe
2. Create a new document: File -> New
3. Write a model specification in the blank document:

    model{
      x ~ dnorm(10, 0.25)
    }

(Note that BUGS parameterises dnorm by mean and precision = 1/variance, so this is a normal distribution with mean 10 and standard deviation 2.)

Running the simulation:
4. Model -> Specification…
5. Specification Tool: check model
6. [Activate the document or area that contains the data]
7. [Specification Tool: load data]
8. Specification Tool: compile
9. [Activate the document or area that contains the set of initial values]
10. [Specification Tool: load inits]
11. Specification Tool: gen inits
12. Inference -> Samples…
13. Sample Monitor Tool: node: “x”
14. Sample Monitor Tool: set
15. Model -> Update…
16. Update Tool: refresh: “1”
17. Update Tool: update
18. Sample Monitor Tool: node: “x”
19. Sample Monitor Tool: history
20. Sample Monitor Tool: stats
21. Sample Monitor Tool: density
(The bracketed steps are needed only when the model has data or user-supplied initial values; this toy model has neither.)
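Once the draws are back in R (for example via the R2WinBUGS sketch above), steps 19-21 have direct equivalents in the coda package. A minimal sketch, assuming fit is the object returned by bugs() earlier:

    library(coda)                          # install.packages("coda") first

    chain <- as.mcmc(fit$sims.list$x)      # monitored draws for node "x"
    traceplot(chain)                       # step 19: history
    summary(chain)                         # step 20: stats
    densplot(chain)                        # step 21: density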