Introduction to Statistics − Day 2
Glen Cowan, CERN Summer Student Lectures on Statistics


1 Introduction to Statistics − Day 2
Glen Cowan, CERN Summer Student Lectures on Statistics

Lecture 1: Probability
  Random variables, probability densities, etc.
  Brief catalogue of probability densities
Lecture 2: The Monte Carlo method
  Statistical tests
  Fisher discriminants, neural networks, etc.
Lecture 3: Goodness-of-fit tests
  Parameter estimation
  Maximum likelihood and least squares
  Interval estimation (setting limits)

2 The Monte Carlo method

What it is: a numerical technique for calculating probabilities and related quantities using sequences of random numbers.

The usual steps:
(1) Generate a sequence r_1, r_2, ..., r_m uniform in [0, 1].
(2) Use this to produce another sequence x_1, x_2, ..., x_n distributed according to some pdf f(x) in which we're interested (x can be a vector).
(3) Use the x values to estimate some property of f(x), e.g., the fraction of x values with a < x < b gives an estimate of ∫_a^b f(x) dx.

→ MC calculation = integration (at least formally)
MC generated values = 'simulated data' → use for testing statistical procedures
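The three steps above can be sketched in Python. The pdf f(x) = 2x on [0, 1] and the transformation x = √r used to sample it are illustrative choices, not from the lecture:

```python
import random

def mc_fraction(a, b, n=100_000, seed=12345):
    """Estimate the integral of f(x) = 2x over (a, b) by the three MC steps."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        r = rng.random()          # step 1: r uniform in [0, 1]
        x = r ** 0.5              # step 2: x follows f(x) = 2x (since P(x < t) = t^2)
        if a < x < b:             # step 3: count the fraction in (a, b)
            inside += 1
    return inside / n

# Exact answer: integral of 2x from 0.2 to 0.8 is 0.64 - 0.04 = 0.60.
est = mc_fraction(0.2, 0.8)
```

The statistical error of such an estimate shrinks like 1/√n, which is why the result with n = 100 000 lands close to 0.60.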

3 Random number generators

Goal: generate uniformly distributed values in [0, 1]. Toss a coin for each of e.g. 32 bits... (too tiring).
→ 'random number generator' = computer algorithm to generate r_1, r_2, ..., r_n.

Example: multiplicative linear congruential generator (MLCG)
  n_{i+1} = (a n_i) mod m,
where
  n_i = integer,
  a = multiplier,
  m = modulus,
  n_0 = seed (initial value).
N.B. mod = modulus (remainder), e.g. 27 mod 5 = 2. This rule produces a sequence of numbers n_0, n_1, ...

4 Random number generators (2)

The sequence is (unfortunately) periodic!
Example (see Brandt Ch. 4): a = 3, m = 7, n_0 = 1 gives
  1, 3, 2, 6, 4, 5, 1, 3, 2, ...  ← sequence repeats
Choose a, m to obtain a long period (maximum = m − 1); m is usually close to the largest integer that can be represented in the computer. Only use a subset of a single period of the sequence.
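A minimal sketch of the MLCG with the toy parameters from this slide (a = 3, m = 7, n_0 = 1), showing the maximum period m − 1 = 6:

```python
def mlcg(a, m, seed):
    """Multiplicative linear congruential generator: n_{i+1} = (a * n_i) mod m."""
    n = seed
    while True:
        n = (a * n) % m
        yield n

# Toy parameters from the slide: a = 3, m = 7, seed n_0 = 1.
gen = mlcg(3, 7, 1)
seq = [next(gen) for _ in range(8)]
# seq = [3, 2, 6, 4, 5, 1, 3, 2]: after 6 values the sequence repeats.
```

Dividing each n_i by m would give values in [0, 1); with such a tiny m they are obviously unusable, which is the point of choosing a large modulus.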

5 Random number generators (3)

The values r_i = n_i / m are in [0, 1], but are they 'random'? Choose a, m so that the r_i pass various tests of randomness:
  uniform distribution in [0, 1],
  all values independent (no correlations between pairs).
e.g. L'Ecuyer, Commun. ACM 31 (1988) 742 suggests suitable values of a and m.
Far better algorithms are available, e.g. RANMAR, which has a very long period. See F. James, Comp. Phys. Comm. 60 (1990) 111; Brandt Ch. 4.

6 The transformation method

Given r_1, r_2, ..., r_n uniform in [0, 1], find x_1, x_2, ..., x_n that follow f(x) by finding a suitable transformation x(r).

Require that the cumulative distribution of x(r) equal r, i.e.
  F(x(r)) = ∫_{−∞}^{x(r)} f(x′) dx′ = r.
That is, set F(x(r)) = r and solve for x(r).

7 Example of the transformation method

Exponential pdf:
  f(x; ξ) = (1/ξ) e^{−x/ξ},  x ≥ 0.
Set
  F(x(r)) = ∫_0^{x(r)} (1/ξ) e^{−x′/ξ} dx′ = 1 − e^{−x(r)/ξ} = r
and solve for x(r):
  → x(r) = −ξ ln(1 − r).
(Since 1 − r is also uniform in [0, 1], x(r) = −ξ ln r works too.)
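The exponential example can be checked numerically; the function name and the choice ξ = 2.0 below are illustrative:

```python
import math
import random

def exp_sample(xi, rng):
    """Inverse-CDF sampling: solve F(x) = 1 - exp(-x/xi) = r for x."""
    r = rng.random()
    return -xi * math.log(1.0 - r)   # x = -xi ln(1 - r); x = -xi ln r works too

rng = random.Random(1)
xs = [exp_sample(2.0, rng) for _ in range(200_000)]
mean = sum(xs) / len(xs)             # sample mean should be close to xi = 2.0
```

The sample mean converging to ξ is a quick sanity check, since the exponential pdf has E[x] = ξ.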

8 The acceptance-rejection method

Enclose the pdf in a box: f(x) ≤ f_max for x_min ≤ x ≤ x_max.
(1) Generate a random number x uniform in [x_min, x_max], i.e. x = x_min + r_1 (x_max − x_min), where r_1 is uniform in [0, 1].
(2) Generate a 2nd independent random number u uniformly distributed between 0 and f_max, i.e. u = r_2 f_max.
(3) If u < f(x), then accept x. If not, reject x and repeat.
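A sketch of the three steps, assuming the example pdf f(x) = (3/2)x² on [−1, 1] (so f_max = 3/2); the function name and numbers are illustrative, not from the lecture:

```python
import random

def accept_reject(f, xmin, xmax, fmax, rng):
    """Acceptance-rejection sampling from pdf f enclosed in a box of height fmax."""
    while True:
        x = xmin + rng.random() * (xmax - xmin)  # (1) x uniform in [xmin, xmax]
        u = rng.random() * fmax                  # (2) u uniform in [0, fmax]
        if u < f(x):                             # (3) accept if the point is under the curve
            return x

rng = random.Random(7)
xs = [accept_reject(lambda x: 1.5 * x * x, -1.0, 1.0, 1.5, rng)
      for _ in range(50_000)]
mean = sum(xs) / len(xs)    # the pdf is symmetric, so the mean is near 0
```

The efficiency of the method is the ratio of the area under f to the area of the box, so a tightly fitting box wastes fewer random numbers.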

9 Example with acceptance-rejection method

(Figure: generated points (x, u) scattered in the bounding box, with the curve f(x).) If the dot lies below the curve, use the x value in the histogram.

10 Monte Carlo event generators

Simple example: e⁺e⁻ → μ⁺μ⁻. Generate cos θ and φ.

Less simple: 'event generators' for a variety of reactions:
  e⁺e⁻ → μ⁺μ⁻, hadrons, ...
  pp → hadrons, D-Y, SUSY, ...
e.g. PYTHIA, HERWIG, ISAJET, ...

Output = 'events', i.e., for each event we get a list of generated particles and their momentum vectors, types, etc.
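A sketch of generating cos θ and φ for the simple example, assuming the lowest-order angular distribution f(cos θ) ∝ 1 + cos²θ (an assumption made here for illustration) and reusing the acceptance-rejection method from the previous slides:

```python
import math
import random

def gen_angles(rng):
    """Generate (cos θ, φ) with f(cos θ) ∝ 1 + cos²θ and φ uniform in [0, 2π)."""
    phi_angle = 2.0 * math.pi * rng.random()
    while True:
        c = -1.0 + 2.0 * rng.random()   # cos θ uniform in [-1, 1]
        u = 2.0 * rng.random()          # f_max = 2 for f(c) = 1 + c²
        if u < 1.0 + c * c:             # accept-reject on the assumed distribution
            return c, phi_angle

rng = random.Random(3)
cs = [gen_angles(rng)[0] for _ in range(50_000)]
mean_c = sum(cs) / len(cs)              # distribution is symmetric, so near 0
```

Real event generators do far more (matrix elements, parton showers, hadronization), but each step ultimately reduces to sampling pdfs like this one.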

11 Monte Carlo detector simulation

Takes as input the particle list and momenta from the generator. Simulates the detector response:
  multiple Coulomb scattering (generate scattering angle),
  particle decays (generate lifetime),
  ionization energy loss (generate the energy loss),
  electromagnetic and hadronic showers,
  production of signals, electronics response, ...

Output = simulated raw data → input to reconstruction software: track finding, fitting, etc.

Predict what you should see at 'detector level' given a certain hypothesis for 'generator level'. Compare with the real data. Estimate 'efficiencies' = # events found / # events generated.

Programming package: GEANT

12 Statistical tests (in a particle physics context)

Suppose the result of a measurement for an individual event is a collection of numbers
  x_1 = number of muons,
  x_2 = mean p_t of jets,
  x_3 = missing energy, ...
The vector x = (x_1, ..., x_n) follows some n-dimensional joint pdf, which depends on the type of event produced, i.e., on which hypothesis is true. For each reaction we consider we will have a hypothesis for the pdf of x, e.g., f(x|H_0), f(x|H_1), etc.

Often call H_0 the signal hypothesis (the event type we want); H_1, H_2, ... are the background hypotheses.

13 Selecting events

Suppose we have a data sample with two kinds of events, corresponding to hypotheses H_0 and H_1, and we want to select those of type H_0. Each event is a point in x-space. What 'decision boundary' should we use to accept/reject events as belonging to event type H_0?

(Figure: scatter of H_0 and H_1 events with a rectangular 'accept' region.) Perhaps select events with 'cuts', e.g. x_i < c_i and x_j < c_j.

14 Other ways to select events

Or maybe use some other sort of decision boundary:
(Figures: a linear and a nonlinear boundary separating the 'accept H_0' region from H_1.)

How can we do this in an 'optimal' way? What are the difficulties in a high-dimensional space?

15 Test statistics

Construct a 'test statistic' of lower dimension (e.g. a scalar)
  t(x_1, ..., x_n).
Try to compactify the data without losing the ability to discriminate between hypotheses. We can work out the pdfs g(t|H_0), g(t|H_1), ...

The decision boundary is now a single 'cut' on t. This effectively divides the sample space into two regions, where we accept or reject H_0.

16 Significance level and power of a test

Probability to reject H_0 if it is true (error of the 1st kind):
  α = ∫_{t_cut}^{∞} g(t|H_0) dt  (significance level)
Probability to accept H_0 if H_1 is true (error of the 2nd kind):
  β = ∫_{−∞}^{t_cut} g(t|H_1) dt  (1 − β = power)
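To make α and β concrete, suppose (purely for illustration) Gaussian pdfs g(t|H_0) = N(0, 1) and g(t|H_1) = N(2, 1), with acceptance region t < t_cut:

```python
import math

def phi(z):
    """Standard normal cumulative distribution, via math.erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative pdfs: g(t|H0) = N(0, 1), g(t|H1) = N(2, 1); accept H0 if t < t_cut.
t_cut = 1.0
alpha = 1.0 - phi(t_cut)        # P(reject H0 | H0 true): area of N(0,1) above t_cut
beta = phi(t_cut - 2.0)         # P(accept H0 | H1 true): area of N(2,1) below t_cut
power = 1.0 - beta
```

Moving t_cut trades one error for the other: a looser cut lowers α but raises β, which is the trade-off the Neyman-Pearson lemma later addresses.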

17 Efficiency of event selection

Probability to accept an event which is signal (signal efficiency):
  ε_s = P(t < t_cut | s) = ∫_{−∞}^{t_cut} g(t|s) dt
Probability to accept an event which is background (background efficiency):
  ε_b = P(t < t_cut | b) = ∫_{−∞}^{t_cut} g(t|b) dt

18 Purity of event selection

Suppose there is only one background type b; the overall fractions of signal and background events are π_s and π_b (prior probabilities). Suppose we select events with t < t_cut. What is the 'purity' of our selected sample? Here purity means the probability to be signal given that the event was accepted. Using Bayes' theorem we find
  P(s | t < t_cut) = π_s ε_s / (π_s ε_s + π_b ε_b).
So the purity depends on the prior probabilities as well as on the signal and background efficiencies.
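The purity formula can be evaluated for some hypothetical numbers; the efficiencies and priors below are made up for illustration:

```python
def purity(eps_s, eps_b, pi_s, pi_b):
    """Bayes' theorem: P(signal | accepted) from efficiencies and priors."""
    return pi_s * eps_s / (pi_s * eps_s + pi_b * eps_b)

# Hypothetical numbers: 90% signal efficiency, 1% background efficiency,
# but only 1 signal event per 1000 before selection.
p = purity(eps_s=0.90, eps_b=0.01, pi_s=0.001, pi_b=0.999)
# p is about 0.08: even a tight selection yields a mostly-background sample
# when the signal prior is tiny.
```

This illustrates the point of the slide: high signal efficiency and low background efficiency do not by themselves guarantee high purity.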

19 Constructing a test statistic

How can we select events in an 'optimal' way? The Neyman-Pearson lemma (proof in Brandt Ch. 8) states: to get the lowest ε_b for a given ε_s (highest power for a given significance level), choose the acceptance region such that
  f(x|H_0) / f(x|H_1) > c,
where c is a constant which determines ε_s. Equivalently, the optimal scalar test statistic is
  t(x) = f(x|H_0) / f(x|H_1).

20 Why Neyman-Pearson doesn't always help

The problem is that we usually don't have explicit formulae for the pdfs f(x|H_0), f(x|H_1). Instead we may have Monte Carlo models for the signal and background processes, so we can produce simulated data and enter each event into an n-dimensional histogram. Using e.g. M bins for each of the n dimensions gives a total of M^n cells. But n is potentially large → a prohibitively large number of cells to populate with Monte Carlo data.

Compromise: make an Ansatz for the form of the test statistic with fewer parameters; determine them (e.g. using MC) to give the best discrimination between signal and background.

21 Linear test statistic

Ansatz:
  t(x) = Σ_{i=1}^{n} a_i x_i.
Choose the parameters a_1, ..., a_n so that the pdfs g(t|s), g(t|b) have maximum 'separation': a large distance between the mean values τ_s, τ_b and small widths Σ_s, Σ_b.
→ Fisher: maximize
  J(a) = (τ_s − τ_b)² / (Σ_s² + Σ_b²).

22 Fisher discriminant

Using this definition of separation gives a Fisher discriminant.
(Figure: a linear decision boundary separating the 'accept H_0' region from H_1.)
Corresponds to a linear decision boundary. Equivalent to Neyman-Pearson if the signal and background pdfs are multivariate Gaussian with equal covariances; otherwise not optimal, but still often a simple, practical solution.
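A minimal two-variable sketch, using the standard Fisher solution a ∝ W⁻¹(μ_s − μ_b) with W = S_s + S_b (the sum of the class covariance matrices); the numbers below are illustrative, not from the lecture:

```python
def fisher_coeffs(mu_s, mu_b, W):
    """Solve W a = mu_s - mu_b for a 2x2 matrix W, by Cramer's rule."""
    d = (mu_s[0] - mu_b[0], mu_s[1] - mu_b[1])
    det = W[0][0] * W[1][1] - W[0][1] * W[1][0]
    a0 = (d[0] * W[1][1] - d[1] * W[0][1]) / det
    a1 = (d[1] * W[0][0] - d[0] * W[1][0]) / det
    return a0, a1

def t_stat(x, a):
    """Linear test statistic t(x) = sum_i a_i x_i."""
    return a[0] * x[0] + a[1] * x[1]

# Illustrative inputs: signal shifted along x_1, equal diagonal covariances.
a = fisher_coeffs(mu_s=(2.0, 0.0), mu_b=(0.0, 0.0),
                  W=[[2.0, 0.0], [0.0, 2.0]])
# Here a = (1.0, 0.0): the discriminant uses only x_1, as expected when
# the classes differ only in that variable.
```

With correlated or unequal-width variables, W⁻¹ rotates and rescales the direction, which is exactly what plain rectangular cuts cannot do.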

23 Nonlinear test statistics

The optimal decision boundary may not be a hyperplane → nonlinear test statistic.
(Figure: a curved decision boundary around the 'accept H_0' region.)
Multivariate statistical methods are a Big Industry: neural networks, support vector machines, kernel density methods, ... Particle physics can benefit from progress in machine learning.

24 Neural network example from LEP II

Signal: e⁺e⁻ → W⁺W⁻ (often 4 well-separated hadron jets)
Background: e⁺e⁻ → qqgg (4 less well-separated hadron jets)

The input variables are based on jet structure, event shape, ...; none by itself gives much separation. The neural network output does better.
(Garrido, Juste and Martinez, ALEPH)

25 Wrapping up lecture 2

We've seen the Monte Carlo method: calculations based on sequences of random numbers, used to simulate particle collisions and detector response.

And we looked at statistical tests and related issues: discriminating between event types (hypotheses), determining selection efficiency, sample purity, etc.

Some modern (and less modern) methods were mentioned: Fisher discriminants, neural networks, support vector machines, ...

In the next lecture we will talk about goodness-of-fit tests and then move on to another main subfield of statistical inference: parameter estimation.