Inspiral Parameter Estimation via Markov Chain Monte Carlo (MCMC) Methods. Nelson Christensen, Carleton College. LIGO-G020104-00-Z.

Inspiral Detected by One Interferometer
From the data, estimate m1, m2, and the amplitude of the signal. Generate a probability distribution function (PDF) for these parameters => statistics.

More Parameters - MCMC
MCMC methods are a demonstrated way to handle large numbers of parameters. Expand the one-interferometer problem to include terms such as the spins of the masses. In the multiple-interferometer problem, the source sky position and the polarization of the wave are additional parameters to estimate and to generate PDFs for.

Initial Study
Used "off the shelf" MCMC software. See Christensen and Meyer, PRD 64 (2001).

Present Work
Develop an MCMC routine that operates within LAL (the LIGO Algorithm Library). Uses the LAL routine "findchirp" with some modifications. Uses the Metropolis-Hastings algorithm.

Bayes' Theorem
p(θ|z) = p(θ) p(z|θ) / p(z), where z = the data, p(θ) is the prior, p(z|θ) is the likelihood, and p(z) is the normalization (evidence).
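A minimal numerical sketch of the theorem (illustrative only, not from the slides): the posterior for the mean θ of Gaussian data z is evaluated on a grid as prior times likelihood, then normalized. All names and numbers here are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(loc=1.5, scale=1.0, size=50)      # simulated data, unit-variance noise

theta = np.linspace(-2.0, 5.0, 2001)             # grid over the parameter
dt = theta[1] - theta[0]
log_prior = np.zeros_like(theta)                 # flat prior p(theta)
# Gaussian log-likelihood log p(z|theta), one value per grid point
log_like = np.array([-0.5 * np.sum((z - t) ** 2) for t in theta])

log_post = log_prior + log_like                  # Bayes: posterior ∝ prior × likelihood
post = np.exp(log_post - log_post.max())
post /= post.sum() * dt                          # dividing out p(z), the normalization

posterior_mean = np.sum(theta * post) * dt       # point estimate from the PDF
```

With a flat prior the posterior mean lands on the sample mean of the data; the grid approach works here only because the problem is one-dimensional, which is exactly what the next slides address.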

Bayes - Normally Hard to Calculate
The PDF for parameter θi is the marginal posterior, obtained by integrating the joint posterior p(θ|z) over all the other parameters. An estimate for θi then follows, e.g. the posterior mean ∫ θi p(θi|z) dθi. These high-dimensional integrals are normally hard to calculate directly.

MCMC Does the Integral
The parameter space is sampled in a quasi-random fashion; steps through parameter space are weighted by the likelihood and the a priori distributions. z = s + n: the data are the sum of signal plus noise.

Markov Chain of Parameter Values
Start with some initial parameter values θ. In some "random" way, select new candidate parameters θ'. Calculate the Metropolis-Hastings ratio α (the posterior probability of the candidate relative to the current values, corrected for the proposal distribution). Accept the candidate as the new chain member if α > 1; if α < 1, accept the candidate with probability α. If the candidate is rejected, the last value also becomes the new chain value. Repeat 250,000 times or so.
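The steps above can be sketched as follows. This is an illustrative toy, not the LAL routine: the target is a stand-in one-dimensional Gaussian posterior, and the symmetric Gaussian proposal makes the Hastings correction cancel.

```python
import numpy as np

def log_posterior(theta):
    # stand-in for log[p(theta) p(z|theta)]; a unit-variance Gaussian centred at 2
    return -0.5 * (theta - 2.0) ** 2

rng = np.random.default_rng(42)
n_steps = 250_000                    # "repeat 250,000 times or so"
chain = np.empty(n_steps)
theta = 0.0                          # some initial parameter value
logp = log_posterior(theta)

for i in range(n_steps):
    candidate = theta + rng.normal(scale=1.0)    # "random" symmetric proposal
    logp_cand = log_posterior(candidate)
    # alpha = posterior(candidate) / posterior(current); accept if alpha > 1,
    # otherwise accept with probability alpha (done in log space for stability)
    if np.log(rng.uniform()) < logp_cand - logp:
        theta, logp = candidate, logp_cand
    chain[i] = theta                 # a rejected candidate repeats the last value
```

Working with log-probabilities avoids underflow when the likelihood involves many data points, which matters for real interferometer data.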

Example from “off the shelf” software

Markov Chain Represents the PDF
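Because the chain samples are draws from the posterior, simple sample statistics recover the marginal PDF and parameter estimates. A hedged sketch (the chain here is a made-up stand-in; in practice it would come from the MCMC run):

```python
import numpy as np

rng = np.random.default_rng(7)
# stand-in chain: in practice these samples come out of the Metropolis-Hastings run
chain = rng.normal(loc=2.0, scale=1.0, size=250_000)

estimate = chain.mean()                                  # posterior-mean estimate
lo, hi = np.percentile(chain, [2.5, 97.5])               # 95% credible interval
pdf, edges = np.histogram(chain, bins=100, density=True) # marginal PDF estimate
```

The histogram of chain values is exactly the "chain represents the PDF" statement: no explicit integration over the other parameters is ever needed, because marginalization amounts to just ignoring the other coordinates of each sample.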

New Metropolis-Hastings Routine

Metropolis-Hastings Algorithm
We have applied this method to estimating 10 cosmological parameters from CMB data; see Knox, Christensen, Skordis, ApJ Lett 563, L95 (2001). The Metropolis-Hastings algorithm is described in Christensen et al., Class. Quantum Grav. 18, 2677 (2001).