What is fiducial inference - and why?


What is fiducial inference - and why? Gunnar Taraldsen Statistics seminar at NTNU, October 17th 2016

Suggested Norwegian terms (Forslag til norske ord): "fidus inferens" = trustworthy inference; "fidus fordeling" = trustworthiness.

Abstract: This seminar contribution, with active discussion with the audience, presents some of the historical background, the original example presented by Fisher, and recent developments and trends as seen by members of the BFF group (Bayes-Fiducial-Frequentist, also read as Best-Friends-Forever) and in recent and upcoming JASA publications.

BFF: Min-ge Xie et al., http://stat.rutgers.edu/bff2016-program

BFF = Best-Friends-Forever = Bayes-Fiducial-Frequentist

What is fiducial inference?
- Fisher's biggest blunder; essentially dead (Pedersen, 1978)
- A big hit in the 21st century (Efron, 1998)
- A tool for fusion learning (Regina Liu, 2015)
- Data-dependent priors, and more (Hannig, 2016)
- Inferential models (Ryan Martin, 2015)
- Dempster (1967)-Shafer (1976) calculus, and belief and plausibility functions
- Confidence: Birnbaum (1961), Schweder-Hjort (2016)

Why fiducial inference?
- Fisher's idea: a mode of inference that produces a posterior epistemic distribution for cases without a prior epistemic probability.
- Introduced by Fisher in 1930, and his argument is seemingly the first formulation of what a confidence distribution is. He argued later (1973) that it should not be a confidence distribution.
- An alternative to Bayesian and frequentist inference.
- Fraser introduced the term structural inference, and this gives a very promising idea for a general theory. This will be presented at the end, but first history and some alternatives.
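As a concrete illustration (not from the slides): for the mean of a normal sample with known standard deviation, the fiducial distribution obtained from the usual pivot coincides with the confidence distribution. A minimal Python sketch, with hypothetical numbers:

```python
import math

def fiducial_cdf_mean(mu, ybar, sigma, n):
    # Pivot: Z = sqrt(n) * (ybar - mu) / sigma ~ N(0, 1).
    # Fiducial (= confidence) CDF of mu given the data:
    # C(mu) = Phi(sqrt(n) * (mu - ybar) / sigma).
    z = math.sqrt(n) * (mu - ybar) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical data summary: ybar = 1.2 from n = 25 observations, known sigma = 2.
ybar, sigma, n = 1.2, 2.0, 25
print(fiducial_cdf_mean(ybar, ybar, sigma, n))         # 0.5: the fiducial median is ybar
print(fiducial_cdf_mean(ybar + 1.96 * sigma / n**0.5,  # ~0.975: the central 95% fiducial
                        ybar, sigma, n))               # interval matches the classical CI
```

The central 95% fiducial interval read off this CDF is exactly the textbook confidence interval, which is the sense in which Fisher's 1930 argument anticipates Neyman's confidence property.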

Biologist and statistician Ronald Fisher. Fiducial inference is important in the history of statistics, since its development led to the parallel development of widely used concepts and tools in theoretical statistics. It was invented by Fisher in the 1930 paper "Inverse probability". The paper considers the correlation coefficient in the bivariate normal distribution, and the argument gives a confidence distribution as the fiducial. Neyman (1937) introduced the idea of "confidence" in his paper on confidence intervals: the frequentist property.

Arthur P. Dempster at the Workshop on Theory of Belief Functions (Brest, 1 April 2010).

Zabell, S. L. (Aug 1992). "R. A. Fisher and the Fiducial Argument". Statistical Science. 7 (3): 369–387. Fisher admitted that "fiducial inference" had problems. Fisher wrote to George A. Barnard that he was "not clear in the head" about one problem on fiducial inference, and, also writing to Barnard, Fisher complained that his theory seemed to have only "an asymptotic approach to intelligibility". Later Fisher confessed that "I don't understand yet what fiducial probability does. We shall have to live with it a long time before we know what it's doing for us. But it should not be ignored just because we don't yet have a clear interpretation".

Hannig et al (JASA 2016, accepted): Generalized Fiducial Inference: A Review and New Results

Hannig et al (2016): 1. The idea behind GFD is very similar to the idea behind the likelihood function: what is the chance of observing my data if a given parameter value were true? The added value of GFD is that it combines the likelihood function with an appropriate Jacobian, obtaining a proper probability distribution on the parameter space.
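The likelihood-plus-Jacobian idea can be written out. A sketch in LaTeX, with notation chosen here (r for the generalized fiducial density, J for the Jacobian term); see Hannig et al (2016) for the precise definition of J:

```latex
% Generalized fiducial density: the likelihood reweighted by a Jacobian,
% normalized to a proper distribution on the parameter space.
r(\theta \mid y) \;=\;
  \frac{f(y \mid \theta)\, J(y,\theta)}
       {\int f(y \mid \theta')\, J(y,\theta')\, \mathrm{d}\theta'}
```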

Hannig et al (2016): 2. GFD does not presume that the parameter is random. Instead, it should be viewed as a distribution estimator (rather than a point or interval estimator) of the fixed true parameter. To validate this distribution estimator in a specific example, we then typically demonstrate good small-sample performance by simulation and prove good large-sample properties by asymptotic theorems.
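A minimal sketch of such a validation-by-simulation, for the toy case of a normal mean with known sigma (numbers chosen here for illustration): check that the central 95% interval of the fiducial distribution has frequentist coverage close to 95%.

```python
import math
import random

def coverage_sim(mu_true=0.0, sigma=1.0, n=10, reps=20000, seed=1):
    # The central 95% fiducial interval for a normal mean (known sigma)
    # is ybar +/- 1.96 * sigma / sqrt(n); estimate its coverage of mu_true.
    rng = random.Random(seed)
    half = 1.96 * sigma / math.sqrt(n)
    hits = 0
    for _ in range(reps):
        ybar = sum(rng.gauss(mu_true, sigma) for _ in range(n)) / n
        if abs(ybar - mu_true) <= half:
            hits += 1
    return hits / reps

print(coverage_sim())  # close to 0.95
```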

Hannig et al (2016): 3. From a Bayesian point of view, Bayes' theorem updates the distribution of U after the data are observed. However, when no prior information is present, changing the distribution of U only by restricting it to the set "there is at least one θ solving the equation y = G(U, θ)" seems to us a reasonable choice.
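The restriction step is easiest to see in a model where y = G(U, θ) is solvable for every U. A minimal Python sketch for the hypothetical location model G(U, θ) = θ + U with U ~ N(0, 1): re-simulate U and solve for θ, giving the fiducial distribution N(y, 1).

```python
import random

def fiducial_sample_location(y, m=50000, seed=42):
    # Data-generating equation: y = G(U, theta) = theta + U, U ~ N(0, 1).
    # Every U gives a unique solution theta* = y - U*, so no restriction
    # of the distribution of U is needed in this toy model.
    rng = random.Random(seed)
    return [y - rng.gauss(0.0, 1.0) for _ in range(m)]

sample = fiducial_sample_location(2.0)
print(sum(sample) / len(sample))  # approximately 2.0: fiducial mean is y
```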

Hannig et al (2016)

Schweder-Hjort (2016) book
- Defines confidence inference and develops its basic theory.
- Includes many worked examples of confidence inference, with emphasis on the confidence curve as a good format for reporting.
- Presents methods for meta-analysis and other forms of combining information, which go beyond present-day theory based on approximate normality.

Taraldsen-Lindqvist (2013): Optimal rule

Fiducial inference and structure
- The structure of the fiducial equation is included. Information in the fiducial equation? Additional parameters à la Fraser.
- Copulae, SEM, and other structural models can be studied to clarify this point of view.
- The main difference between Bayesian and fiducial inference is that the latter takes the structure of the fiducial model into account, and not just the resulting statistical model.
- A fiducial model is more than a Bayesian model and generalizes Bayesian inference.

THE END