The chi-squared statistic χ²

Measures "goodness of fit"; used for model fitting and hypothesis testing, e.g. fitting a function C(p_1, p_2, ..., p_M; x) to a set of N data pairs (x_i, y_i), where the y_i have associated uncertainties σ_i. Define the statistic:

    \chi^2 = \sum_{i=1}^{N} \frac{\left[ y_i - C(p_1, \dots, p_M; x_i) \right]^2}{\sigma_i^2}

If C has M fitting parameters, expect χ² ≈ N − M.
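The definition above can be sketched in a few lines of Python. The straight-line model, data values, and trial parameters below are invented purely for illustration:

```python
# Sketch: computing chi-squared for a model C(p; x) against data with errors.
# Data, model, and parameter values are made up for illustration only.

def chi_squared(y, y_model, sigma):
    """Sum of squared, error-weighted residuals."""
    return sum((yi - ci) ** 2 / si ** 2 for yi, ci, si in zip(y, y_model, sigma))

# A straight-line model C(a, b; x) = a + b*x with M = 2 parameters.
x = [0.0, 1.0, 2.0, 3.0]
y = [1.1, 2.9, 5.2, 6.8]
sigma = [0.2, 0.2, 0.3, 0.3]

a, b = 1.0, 2.0                       # trial parameter values
y_model = [a + b * xi for xi in x]
chi2 = chi_squared(y, y_model, sigma)
# For a good fit we expect chi2 to be of order N - M = 4 - 2 = 2.
```

A perfect fit gives χ² = 0; values much larger than N − M signal a poor model or underestimated errors.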

χ² fitting approach

Consider a set of data points X_i with a common mean and individual errors σ_i. We have already seen the optimally weighted average:

    \hat{A} = \frac{\sum_i X_i / \sigma_i^2}{\sum_i 1 / \sigma_i^2}

Alternatively, use the goodness of fit

    \chi^2(A) = \sum_i \frac{(X_i - A)^2}{\sigma_i^2}

and find the value of A that minimises χ².

Parameter fitting by minimizing χ²

Set the derivative of χ² with respect to A to zero and solve:

    \frac{d\chi^2}{dA} = -2 \sum_i \frac{X_i - A}{\sigma_i^2} = 0
    \quad\Rightarrow\quad
    \hat{A} = \frac{\sum_i X_i / \sigma_i^2}{\sum_i 1 / \sigma_i^2}

In other words, the optimally weighted average also minimizes χ².
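This equivalence is easy to check numerically. The data values below are invented for illustration:

```python
# Sketch: verify numerically that the optimally weighted average minimizes
# chi2(A) = sum_i (X_i - A)^2 / sigma_i^2.  Data are invented for illustration.

X = [9.8, 10.3, 9.9, 10.6]
sigma = [0.1, 0.4, 0.2, 0.5]

w = [1.0 / s ** 2 for s in sigma]                        # weights 1/sigma_i^2
A_hat = sum(wi * xi for wi, xi in zip(w, X)) / sum(w)    # weighted average

def chi2(A):
    return sum((xi - A) ** 2 / si ** 2 for xi, si in zip(X, sigma))

# chi2 at the weighted average is no larger than at nearby trial values.
assert chi2(A_hat) <= chi2(A_hat - 0.01)
assert chi2(A_hat) <= chi2(A_hat + 0.01)
```

Because χ²(A) is an exact parabola in A, the minimum found this way agrees with the analytic weighted average to machine precision.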

Using χ² to estimate parameter uncertainties

The variance of the optimally weighted average is

    \sigma_A^2 = \frac{1}{\sum_i 1 / \sigma_i^2}

What is Δχ² for A = Â ± σ_A? Use a Taylor series about the minimum (where the first derivative vanishes):

    \chi^2(A) = \chi^2_{\rm min} + \frac{1}{2} \frac{d^2\chi^2}{dA^2} (A - \hat{A})^2

Now

    \frac{d^2\chi^2}{dA^2} = 2 \sum_i \frac{1}{\sigma_i^2} = \frac{2}{\sigma_A^2}

so

    \chi^2(A) = \chi^2_{\rm min} + \frac{(A - \hat{A})^2}{\sigma_A^2}

Hence Δχ² = 1 when A = Â ± σ_A. [Figure: parabola of χ² versus A, with χ²_min at Â and Δχ² = 1 at Â ± σ_A]

Error bars from χ² curvature

We have just seen that

    \Delta\chi^2 = \chi^2(A) - \chi^2_{\rm min} = \frac{(A - \hat{A})^2}{\sigma_A^2}

Hence Δχ² ≤ 1 encloses 68% of the probability for A, and we use Δχ² ≤ 1 to get "1σ" error bars on the value of a single parameter fitted to data. Equivalently, use the second derivative (curvature):

    \sigma_A^2 = \frac{2}{d^2\chi^2 / dA^2}

For the case χ²(A) = Σ_i (X_i − A)² / σ_i², we get

    \sigma_A^2 = \frac{1}{\sum_i 1 / \sigma_i^2}
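The Δχ² = 1 rule and the curvature formula can be checked against each other numerically. The data values below are invented for illustration:

```python
import math

# Sketch: 1-sigma error bar on a fitted mean from the Delta-chi2 = 1 rule,
# compared with the analytic result sigma_A^2 = 1 / sum(1/sigma_i^2).
# Data values are invented for illustration.
X = [4.9, 5.2, 5.0]
sigma = [0.2, 0.3, 0.2]

w = [1.0 / s ** 2 for s in sigma]
A_hat = sum(wi * xi for wi, xi in zip(w, X)) / sum(w)
sigma_A = math.sqrt(1.0 / sum(w))          # from the chi2 curvature

def chi2(A):
    return sum((xi - A) ** 2 / si ** 2 for xi, si in zip(X, sigma))

# Since chi2(A) is an exact parabola, Delta-chi2 at A_hat +/- sigma_A is 1.
delta = chi2(A_hat + sigma_A) - chi2(A_hat)
assert abs(delta - 1.0) < 1e-6
```

For this single-parameter, Gaussian-error case the agreement is exact; for nonlinear models the parabola is only a local approximation near the minimum.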

Scaling a profile by χ² minimization

As before:
– X_i = data, known.
– σ_i = error bars, known.
– p_i = profile, known.
– A p_i = profile scaled by factor A.

Goodness of fit:

    \chi^2(A) = \sum_i \frac{(X_i - A\, p_i)^2}{\sigma_i^2}

Error bar on the scale factor

Use the χ² curvature method. The second derivative is

    \frac{d^2\chi^2}{dA^2} = 2 \sum_i \frac{p_i^2}{\sigma_i^2}

Using Δχ² = 1:

    \sigma_A^2 = \frac{1}{\sum_i p_i^2 / \sigma_i^2}

[Figure: parabola of χ² versus A, with χ²_min at Â and Δχ² = 1 at Â ± σ_A]
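The profile-scaling fit and its error bar can be sketched as follows. The profile, "observed" data, and errors below are all invented for illustration; the formula for Â comes from setting dχ²/dA = 0:

```python
import math

# Sketch: fit a scale factor A so that A*p_i matches data X_i, with the error
# bar from the chi2 curvature.  Profile and data values are invented.
p = [1.0, 2.0, 3.0, 2.0]                  # known profile
X = [1.4, 3.1, 4.6, 2.9]                  # "observed" data, roughly 1.5 * p
sigma = [0.1, 0.1, 0.2, 0.1]

# Minimizing chi2(A) = sum (X_i - A p_i)^2 / sigma_i^2 gives:
num = sum(pi * xi / si ** 2 for pi, xi, si in zip(p, X, sigma))
den = sum(pi ** 2 / si ** 2 for pi, si in zip(p, sigma))
A_hat = num / den

# Second derivative d2chi2/dA2 = 2 * den, so Delta-chi2 = 1 gives:
sigma_A = math.sqrt(1.0 / den)
```

Note that the constant-mean case earlier is recovered by setting every p_i = 1.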