PHYSICS 2150 LABORATORY Dr. Laurel Mayhew TA: Geruo A Lab Coordinator: Jerry Leigh Lecture 4b February 27, 2007.

TOPICS
- Fitting exponentials and other generic functions
- Covariance and correlation

FITTING TO AN EXPONENTIAL
- Some functions can easily be "converted" to linear form and then fitted to a line.
- Exponential decay shows up in many places, especially radioactive decay.
- The decay rate drops with time: N(t) = N₀ exp(−t/τ), where τ is the mean lifetime; the half-life is τ ln(2).
- However, ln N = ln N₀ − t/τ, which is linear in t. So we can do a linear fit to ln N and extract τ.
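The linearization trick above can be sketched in a few lines of Python. The decay-count data here is hypothetical (generated from τ ≈ 3 s), purely for illustration:

```python
import numpy as np

# Hypothetical decay data: times t (s) and observed counts N(t).
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
N = np.array([1000.0, 717.0, 513.0, 368.0, 264.0, 189.0])

# Linearize: ln N = ln N0 - t/tau, then do a straight-line fit to ln N.
slope, intercept = np.polyfit(t, np.log(N), 1)

tau = -1.0 / slope            # mean lifetime from the slope
N0 = np.exp(intercept)        # initial count from the intercept
half_life = tau * np.log(2)   # half-life = tau * ln(2)
print(tau, N0, half_life)
```

Note that an unweighted fit to ln N implicitly assumes equal uncertainties on ln N; with real counting data the points should be weighted accordingly.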

FITTING TO AN EXPONENTIAL

FITTING TO A MORE GENERAL FUNCTION
- It is possible to fit a completely general function f(x), where f depends on parameters A, B, C, ...
- Obtain N measurements y_i at N values x_i.
- Construct the chi-squared: χ² = Σ_i [y_i − f(x_i; A, B, C, ...)]² / σ_i²
- Scan over possible values of the parameters A, B, C, ... and find the values that minimize χ².
- These values are the most likely values of the parameters. The uncertainty on a parameter is determined by how far you must move it away from its best value to make χ² increase by 1.
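A minimal sketch of the scan described above, using a hypothetical model f(x) = A·exp(−B·x) and made-up data with assumed equal uncertainties:

```python
import numpy as np

# Hypothetical data, roughly following f(x; A, B) = A * exp(-B * x) with A=2, B=1.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([2.0, 1.21, 0.74, 0.45, 0.27])
sigma = np.full_like(y, 0.05)  # assumed equal measurement uncertainties

def chi2(A, B):
    """Chi-squared between the data and the model f(x) = A * exp(-B * x)."""
    model = A * np.exp(-B * x)
    return np.sum(((y - model) / sigma) ** 2)

# Brute-force scan over a grid of candidate parameter values.
A_vals = np.linspace(1.5, 2.5, 201)
B_vals = np.linspace(0.5, 1.5, 201)
grid = np.array([[chi2(A, B) for B in B_vals] for A in A_vals])

i, j = np.unravel_index(np.argmin(grid), grid.shape)
A_best, B_best = A_vals[i], B_vals[j]
print(A_best, B_best, grid[i, j])
```

The parameter uncertainties would then come from widening the scan until χ² rises by 1 above its minimum, as the slide states.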

CHI-SQUARED MINIMIZATION
- How do we know whether we got a good fit (i.e., does the function f describe the data adequately)?
- We can ask how small the minimum χ² value was. Smaller values indicate a better fit (fewer, smaller deviations between the data points and the best-fit function).
- The actual numerical expectation for χ² depends on the number N of data points and the number of parameters (A, B, C, ...) in the function f. More on this in Chapter 12.
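As a rough preview of the Chapter 12 material: a common rule of thumb compares the minimum χ² to the number of degrees of freedom (data points minus fitted parameters). The numbers here are invented for illustration:

```python
# Rough goodness-of-fit check using the reduced chi-squared.
# All numbers below are assumed values for illustration only.
chi2_min = 11.2   # minimized chi-squared from some fit
n_points = 12     # number of data points N
n_params = 2      # number of fitted parameters (e.g., A and B)

dof = n_points - n_params
chi2_reduced = chi2_min / dof
# A reduced chi-squared near 1 suggests the model describes the data adequately;
# a value much larger than 1 suggests a poor fit (or underestimated errors).
print(dof, chi2_reduced)
```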

COVARIANCE AND CORRELATION
- Take a desired measurement q(x, y).
- Error propagation says δq = √[ (∂q/∂x · δx)² + (∂q/∂y · δy)² ]
- We are making two assumptions here: the errors are Gaussian, and x and y are uncorrelated.
- If there is a correlation, the error on q can be either larger or smaller than this estimate.
- Sometimes (in epidemiology and other complex systems) the correlation itself is something interesting to learn.

AN EXAMPLE OF CORRELATED VARIABLES: THE K⁰ MASS
- An incoming K⁺ hits a stationary neutron, producing a proton and a K⁰.
- The K⁰ travels a short distance and decays to π⁺π⁻.
- We need to measure the angle θ_T between the pions.
- We also need to measure θ₊ and θ₋, the angles between each pion and the K⁰ direction.
- Measuring θ_T is easy, but the K⁰ direction can be hard to determine if its path is short.

AN EXAMPLE OF CORRELATED VARIABLES: THE K⁰ MASS
- If the direction of the blue line (the assumed K⁰ flight direction in the figure) is wrong, then θ₊ and θ₋ will be wrong by equal amounts but in opposite directions.
- θ_T = θ₊ + θ₋ will still be OK.
- Thus our measurements of θ₊ and θ₋ will be correlated (actually anticorrelated).

COVARIANCE AND ERROR PROPAGATION
- With correlations included, error propagation becomes δq² = (∂q/∂x)² δx² + (∂q/∂y)² δy² + 2 (∂q/∂x)(∂q/∂y) σ_xy
- Note: when the variables were independent, the covariance term σ_xy was 0.

COVARIANCE AND ERROR PROPAGATION
- If the covariance σ_xy is small compared to the product δx·δy, the error reduces as expected to the standard addition in quadrature.
- If the covariance σ_xy is negative, the cross term subtracts, and the error on the result is actually smaller than addition in quadrature would give!
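A small numerical sketch of this effect, using invented anticorrelated data (in the spirit of the θ₊, θ₋ example) and the simple case q = x + y, where both partial derivatives are 1:

```python
import numpy as np

# Hypothetical repeated measurements of two anticorrelated quantities
# (e.g., theta_plus and theta_minus): when x comes out high, y comes out low.
x = np.array([10.1, 9.8, 10.3, 9.9, 10.0])
y = np.array([4.9, 5.2, 4.8, 5.1, 5.0])

# Derived quantity q(x, y) = x + y, so dq/dx = dq/dy = 1.
dx = np.std(x)
dy = np.std(y)
sxy = np.mean((x - x.mean()) * (y - y.mean()))  # covariance (negative here)

# Quadrature estimate (assumes independence) vs. the full expression.
dq_indep = np.sqrt(dx**2 + dy**2)
dq_full = np.sqrt(dx**2 + dy**2 + 2 * sxy)
print(dq_indep, dq_full)
```

Because σ_xy is negative, the full error on q is smaller than the naive quadrature estimate, exactly as the slide claims.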

COVARIANCE VS. CORRELATION
- The covariance σ_xy can be normalized to create a correlation coefficient r = σ_xy / (σ_x σ_y).
- r can vary between −1 and 1. r = 0 indicates that the variables are uncorrelated; |r| = 1 means the variables are completely correlated (i.e., knowing the value of x completely determines the value of y).
- The sign indicates the direction of the correlation: positive means that a large x makes a large y likely; negative means that a large x makes a small y likely.
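The normalization above is straightforward to compute directly; the paired data here is invented for illustration:

```python
import numpy as np

# Hypothetical paired measurements with a strong positive linear trend.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Correlation coefficient r = sigma_xy / (sigma_x * sigma_y).
sxy = np.mean((x - x.mean()) * (y - y.mean()))
r = sxy / (np.std(x) * np.std(y))
print(r)  # close to +1: strongly positively correlated
```

This matches what `np.corrcoef(x, y)[0, 1]` returns, since the normalization by σ_x σ_y cancels the choice of denominator convention.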

CORRELATION AS A MEASUREMENT
- Sometimes the correlation itself is the interesting result.
- To establish the significance of a correlation, we need to ask what r is, as well as how many measurements N were taken to establish it.
- For a given r and N, look in a table (Taylor, Appendix C) to find the probability that uncorrelated variables would randomly yield a correlation at least that large.
- Is there evidence of linear correlation between the two variables? If the random probability is < 5%, the evidence is significant; if it is < 1%, the evidence is highly significant.
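The table lookup can be mimicked numerically. Under the no-correlation hypothesis, the density of the sample correlation coefficient r for N measurements is proportional to (1 − r²)^((N−4)/2) (the distribution Taylor's Appendix C table is built from); integrating its tails gives the probability of randomly exceeding an observed |r|. This is an illustrative sketch, not a substitute for the table:

```python
import numpy as np

def prob_corr(r_obs, N, n_grid=200001):
    """Probability that N uncorrelated measurements would yield |r| >= r_obs.
    Valid for N > 4; approximates the entries of Taylor's Appendix C table."""
    # Null-hypothesis density of r is proportional to (1 - r^2)^((N - 4) / 2).
    r = np.linspace(-1.0, 1.0, n_grid)
    dens = (1.0 - r**2) ** ((N - 4) / 2.0)
    dr = r[1] - r[0]
    dens /= dens.sum() * dr                       # normalize numerically
    return dens[np.abs(r) >= abs(r_obs)].sum() * dr  # two-sided tail area

# Example: r = 0.6 measured from N = 10 data pairs.
P = prob_corr(0.6, 10)
print(P)  # roughly 0.067: above 5%, so not "significant" by that criterion
```

For N = 10 and r = 0.6 this gives a random probability of about 6.7%, which by the slide's criteria falls just short of significant evidence for a linear correlation.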