Probability & Statistical Inference Lecture 2

Probability & Statistical Inference Lecture 2 MSc in Computing (Data Analytics)

Lecture Outline: Introduction; Introduction to Probability Theory; Discrete Probability Distributions; Question Time

Introduction

Probability & Statistics We want to make decisions based on evidence from a sample, i.e. extrapolate from sample evidence to a general population. To make such decisions we need to be able to quantify our (un)certainty about how good or bad our sample information is. (Slide diagram: Population → Representative Sample → Sample Statistic, with arrows labelled "Describe" and "Make Inference".)

Probability & Statistics - Example Example: How many voters will give F.F. a first preference in the next general election? Researcher A takes a sample of size 10 and finds 4 people who say they will. Researcher B takes a sample of size 100 and finds 25 people who say they will. Researcher A => 40%, Researcher B => 25%. Who would you believe?

Probability & Statistics - Example Intuitively the bigger sample would get more credence, but how much better is it, and are either of the samples any good? Probability helps: descriptive statistics are helpful but still lead to decision making by 'intuition'; probability helps to quantify (un)certainty, which is a more powerful aid to the decision maker.

Probability & Statistics Using probability theory we can measure the amount of uncertainty/certainty in our statistics.

Intuitions and Probability – Lotto example If you had an Irish lotto ticket, which of these sets of numbers is more likely to win: 1 2 3 4 5 6 or 2 11 26 27 35 42? (The odds of winning are 1 in 8,145,060 for any ticket.)
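A minimal sketch of the underlying count, assuming the Irish Lotto draws 6 numbers from 45 (which is where the 1 in 8,145,060 figure comes from):

```python
from math import comb

# Number of possible draws when 6 numbers are chosen from 45 (assumed Irish Lotto format)
total_draws = comb(45, 6)
print(total_draws)                       # 8145060

# Any single ticket -- {1,2,3,4,5,6} or {2,11,26,27,35,42} -- matches exactly
# one of these equally likely draws, so both have the same chance of winning.
p_ticket = 1 / total_draws
print(f"P(win with one ticket) = 1 in {total_draws} = {p_ticket:.2e}")
```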

Intuitions and Probability – Disease example Suppose we have a diagnostic test for a disease which is 99% accurate. A person is picked at random and tested for the disease. The test gives a positive result. What is the probability that the person actually has the disease? 99%?

Disease example No! It depends on how common or rare the disease is. Suppose the disease affects 1 person in 10,000 and we test a population of 1,000,000. Then 100 people have the disease and 999,900 do not. With a 99% accurate test, 99 of the 100 diseased people test positive (1 tests negative), while 9,999 of the 999,900 healthy people test positive (989,901 test negative). Of the 99 + 9,999 = 10,098 people who test positive, only 99 actually have the disease, i.e. just under 1%.
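The frequency tree on this slide can be reproduced with a few lines of arithmetic; a minimal sketch, assuming the 99% accuracy applies equally to people with and without the disease:

```python
population = 1_000_000
prevalence = 1 / 10_000      # disease affects 1 person in 10,000
accuracy   = 0.99            # assumed to apply to both diseased and healthy people

diseased = population * prevalence            # 100
healthy  = population - diseased              # 999,900

true_positives  = diseased * accuracy         # 99 sick people who test positive
false_positives = healthy * (1 - accuracy)    # 9,999 healthy people who test positive

p_disease_given_positive = true_positives / (true_positives + false_positives)
print(f"P(disease | positive test) = {p_disease_given_positive:.4f}")   # about 0.0098
```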

Introduction to Probability Theory

Some Definitions An experiment that can result in different outcomes, even though it is repeated in the same manner every time, is called a random experiment. The set of all possible outcomes of a random experiment is called the sample space of the experiment and is denoted by S. Example: Experiment: Toss two coins and observe the up face on each. Sample Space: Observe HH, Observe HT, Observe TH, Observe TT, i.e. S : {HH,HT,TH,TT}

Some Definitions A sample space is discrete if it consists of a finite or countably infinite set of outcomes. A sample space is continuous if it contains an interval of real numbers. An event is a subset of the sample space of a random experiment.

Counting A permutation of the elements is an ordered sequence of the elements. Example: S : {a,b,c} All the permutations of the elements of S are abc, acb, bca, bac, cba & cab. The number of permutations of n different elements is n! Where: n! = n * (n-1) * (n-2) * .......* 2 * 1 Above n=3 => 3! = 3 * 2 * 1 = 6

Counting The number of permutations of r elements selected from a set of n different elements is nPr = n! / (n - r)!. When order is not important, the number of ways of selecting r elements from a set of n different elements is called a combination: nCr = n! / (r! (n - r)!)
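These counting formulas can be checked directly in Python (math.perm and math.comb implement n!/(n-r)! and n!/(r!(n-r)!)); a minimal sketch:

```python
from math import factorial, perm, comb
from itertools import permutations

# Permutations of all elements of S = {a, b, c}: 3! = 6
S = ['a', 'b', 'c']
print(list(permutations(S)))          # the 6 orderings abc, acb, ...
print(factorial(3))                   # 6

# Permutations of r elements chosen from n (order matters): n! / (n - r)!
print(perm(5, 2))                     # 20

# Combinations of r elements chosen from n (order ignored): n! / (r! (n - r)!)
print(comb(5, 2))                     # 10
```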

Probability Whenever a sample space consists of N possible outcomes that are equally likely, the probability of each outcome is 1/N. For a discrete sample space, the probability of an event E, denoted by P(E), equals the sum of the probabilities of the outcomes in E. Some rules for probabilities: For a given sample space containing n events E1, E2, E3, ........, En: All simple event probabilities must lie between 0 and 1: 0 <= P(Ei) <= 1 for i=1,2,........,n. The sum of the probabilities of all the simple events within a sample space must be equal to 1: P(E1) + P(E2) + ........ + P(En) = 1

Probability – Example 1 Example: Experiment: Toss two coins and observe the up face on each. Sample Space: S : {HH,HT,TH,TT}. Probability of each event: E = HH => P(HH) = 1/4, E = HT => P(HT) = 1/4, E = TH => P(TH) = 1/4, E = TT => P(TT) = 1/4

Probability – Example 1 The probability of an event A is equal to the sum of all the probabilities in event A: Example: Experiment: Toss two coins and observe the up face on each Event A: {Observe exactly one head} P(A) = P(HT) + P(TH) = ¼ + ¼ = ½ Event B : {Observe at least one head} P(B) = P(HH) + P(HT) + P(TH) = ¼ + ¼ + ¼ = ¾

Probability – Example 2 5-card poker example. It doesn't matter in poker in what order the cards are dealt; merely the combination of certain cards wins. So, how many different hands of poker are there? 52C5 = ? What is the probability of being dealt a royal straight flush? What is the probability of being dealt a poker (i.e. 4 of a kind)?
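A minimal sketch of the poker calculations, using the standard counts (4 royal straight flushes, one per suit, and 13 × 48 = 624 four-of-a-kind hands):

```python
from math import comb

hands = comb(52, 5)                      # 2,598,960 distinct 5-card hands
print(hands)

# Royal straight flush: one per suit (10-J-Q-K-A), 4 in total
p_royal = 4 / hands
print(f"P(royal straight flush) = {p_royal:.8f}")      # about 1 in 649,740

# Four of a kind: choose the rank (13 ways), the 5th card is any of the other 48
p_quads = (13 * 48) / hands
print(f"P(four of a kind)       = {p_quads:.6f}")       # about 0.00024
```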

Compound Events The union of two events A and B is the event that occurs if either A or B, or both, occur on a single performance of the experiment, denoted by A ∪ B (A or B). The intersection of two events A and B is the event that occurs if both A and B occur on a single performance of the experiment, denoted by A ∩ B (A and B). Example: Consider a die-tossing experiment with equally likely simple events {1,2,3,4,5,6}. Define the events A, B and C. A:{Toss an even number} = {2,4,6} B:{Toss a number less than or equal to 3} = {1,2,3} C:{Toss a number greater than 1} = {2,3,4,5,6} Find:

Complementary Event The complement of an event A is the event that A does not occur, denoted by A′. Note that A ∪ A′ = S, the sample space, and P(A) + P(A′) = 1 => P(A) = 1 – P(A′)

Questions What is the sample space when a coin is tossed 3 times? What is the probability of tossing all heads or all tails? What is the sample space of throwing a fair die? If a fair die is thrown, what is the probability of throwing a prime number (2, 3, 5)?

Questions A factory has two assembly lines, each of which is shut down (S), at partial capacity (P), or at full capacity (F). The following table gives the sample space, where (S,P) denotes that the first assembly line is shut down and the second one is operating at partial capacity.

Event A : P(A)
(S,S) : 0.02   (S,P) : 0.06   (S,F) : 0.05
(P,S) : 0.07   (P,P) : 0.14   (P,F) : 0.20
(F,S) : 0.06   (F,P) : 0.21   (F,F) : 0.19
(The probability for (F,S) is not shown on the slide; 0.06 is the value that makes the probabilities sum to 1.)

What is the probability that: Both assembly lines are shut down? Neither assembly line is shut down? At least one assembly line is at full capacity? Exactly one assembly line is at full capacity?
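One way to evaluate these from the table; a minimal sketch in Python (the 0.06 for (F,S) is the inferred value that makes the probabilities sum to 1):

```python
# Sample-space probabilities from the table above
probs = {
    ('S', 'S'): 0.02, ('S', 'P'): 0.06, ('S', 'F'): 0.05,
    ('P', 'S'): 0.07, ('P', 'P'): 0.14, ('P', 'F'): 0.20,
    ('F', 'S'): 0.06, ('F', 'P'): 0.21, ('F', 'F'): 0.19,
}

p_both_shut         = probs[('S', 'S')]
p_neither_shut      = sum(p for (a, b), p in probs.items() if a != 'S' and b != 'S')
p_at_least_one_full = sum(p for (a, b), p in probs.items() if a == 'F' or b == 'F')
p_exactly_one_full  = sum(p for (a, b), p in probs.items() if (a == 'F') != (b == 'F'))

print(p_both_shut, p_neither_shut, p_at_least_one_full, p_exactly_one_full)
# 0.02, 0.74, 0.71, 0.52
```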

Conditional Probability The conditional probability of event A conditional on event B is P(A|B) = P(A ∩ B) / P(B), for P(B) > 0. It measures the probability that event A occurs when it is known that event B occurs. Example: A = odd result on die = {1,3,5}, B = result > 3 = {4,5,6}. Then A ∩ B = {5}, so P(A|B) = (1/6) / (3/6) = 1/3.

Conditional Probability Example Example: A study was carried out to investigate the link between people's lifestyles and cancer. One of the areas looked at was the link between lung cancer and smoking. 10,000 people over the age of 55 were studied over a 10 year period. In that time 277 developed lung cancer. What is the likelihood of somebody developing lung cancer given that they smoke?

              Cancer   No Cancer   Total
Smoker          241       3,325    3,566
Non-Smoker       36       6,398    6,434
Total           277       9,723   10,000

Conditional Probability Example Event A: A person develops lung cancer. Event B: A person is a smoker. P(A) = 277/10,000 = 0.0277, P(B) = 3,566/10,000 = 0.3566, P(A ∩ B) = 241/10,000 = 0.0241, so P(A|B) = P(A ∩ B) / P(B) = 0.0241 / 0.3566 ≈ 0.068.
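A minimal sketch that completes the calculation from the table above:

```python
total             = 10_000
smokers           = 3_566
cancer            = 277
smoker_and_cancer = 241

p_cancer = cancer / total                  # P(A)        = 0.0277
p_smoker = smokers / total                 # P(B)        = 0.3566
p_both   = smoker_and_cancer / total       # P(A and B)  = 0.0241

p_cancer_given_smoker = p_both / p_smoker  # P(A|B) = 241/3,566, about 0.068
print(f"P(cancer | smoker) = {p_cancer_given_smoker:.3f}")
```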

Exercises A ball is chosen at random from a bag containing 150 balls that are either red or blue and either dull or shiny. There are 36 red, shiny balls and 54 blue balls. There are 72 dull balls. What is the probability of a chosen ball being shiny conditional on it being red? What is the probability of a chosen ball being dull conditional on it being blue?

Mutually Exclusive Events Two events, A and B, are mutually exclusive if, when A happens, B cannot also happen. Example: Roll of a die. A = less than 2, B = even result. There is no way that A and B can happen at the same time, therefore they are mutually exclusive events.

Rules for Unions Additive Rule: P(A ∪ B) = P(A) + P(B) – P(A ∩ B). Additive Rule for Mutually Exclusive Events: P(A ∪ B) = P(A) + P(B)

Example Records at an industrial plant show that 12% of all injured workers are admitted to hospital for treatment, 16% are back on the job the next day, and 2% are both admitted to a hospital for treatment and back to work the next day. If a worker is injured, what is the probability that the worker will be either admitted to hospital or back on the job the next day, or both? By the additive rule: P(A ∪ B) = 0.12 + 0.16 – 0.02 = 0.26.

Independent Events Events A and B are independent if it is the case that A happening does not alter the probability that B happens. Example: A = even result on die, B = result > 2. Then, let us say we are told that the result on the die (which someone has observed but not us) is even. Knowing this, what is the probability that event B has happened? Sample space: {2, 4, 6}; B = 4 or 6 => P(B) = 2/3

Independent Events But if we didn't know about the even result we would get: Sample space: {1, 2, 3, 4, 5, 6}; B = 3 or 4 or 5 or 6 => P(B) = 4/6 = 2/3. So knowledge about event A has in no way changed our probability assessment concerning event B.

Rules for Intersection Multiplicative Rule of Probability: P(A ∩ B) = P(A|B) · P(B) = P(B|A) · P(A). If events A and B are independent then P(A ∩ B) = P(A) · P(B)

Bayes Theorem One of a number of very useful results; here is the simplest definition. Suppose you have two events which are mutually exclusive and exhaustive, i.e. they account for all the sample space: call these events A and A′ (read 'not A'). Further suppose there is another event B, such that P(B|A) > 0 and P(A|B) > 0. Then Bayes' theorem states: P(A|B) = P(B|A) · P(A) / [ P(B|A) · P(A) + P(B|A′) · P(A′) ]
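As a check, applying Bayes' theorem to the earlier disease-testing example reproduces the frequency-tree answer; a minimal sketch (assuming the 99% accuracy applies to both the diseased and the healthy groups):

```python
p_a      = 1 / 10_000          # P(A): person has the disease
p_not_a  = 1 - p_a             # P(A')
p_b_a    = 0.99                # P(B | A): positive test given disease
p_b_nota = 0.01                # P(B | A'): positive test given no disease (assumed 1 - accuracy)

# Bayes' theorem: P(A|B) = P(B|A)P(A) / (P(B|A)P(A) + P(B|A')P(A'))
p_a_b = (p_b_a * p_a) / (p_b_a * p_a + p_b_nota * p_not_a)
print(f"P(disease | positive test) = {p_a_b:.4f}")   # about 0.0098, not 0.99
```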

Discrete Probability Distributions

Discrete Random Variable A Random Variable (RV) is obtained by assigning a numerical value to each outcome of a particular experiment. Probability Distribution: a table or formula that specifies the probability of each possible value of the Discrete Random Variable (DRV). DRV: an RV that takes whole-number values only.

Example: What is the probability distribution for the experiment to assess the no. of tails from tossing 2 coins? Sample space (Coin 1, Coin 2): TT, TH, HT, HH. x = no. of tails is the RV.

x : P(x)
0 : P(HH) = 0.25
1 : P(TH) + P(HT) = 0.50
2 : P(TT) = 0.25
P(any other value) = 0

N.B. Σ P(x) = 1 and 0 ≤ P(x) ≤ 1 for all values of x

Mean of a Discrete Random Variable Mean of a DRV = μ = Σ x · P(x). Example: Throw a fair die.

x : P(x)   : x · P(x)
1 : 0.1667 : 0.17
2 : 0.1667 : 0.33
3 : 0.1667 : 0.50
4 : 0.1667 : 0.67
5 : 0.1667 : 0.83
6 : 0.1667 : 1.00
P(any other value) = 0 : 0

Mean = μ = Σ x · P(x) = 3.5

Some Examples Simulated sample of size 10: mean = 3.7, S.D. = 2.1. Simulated sample of size 100: mean = 3.54, S.D. = 1.67. (Slides show histograms of the simulated die rolls, outcomes 1–6.)

Simulated sample of size 1000: mean = 3.49, S.D. = 1.73. (Histogram of the simulated die rolls, outcomes 1–6.)

Note: The largest simulation had the mean closest to that predicted by the probability distribution. As the simulations got bigger, the mean approached 3.5. The mean of a DRV is the mean of a large number of independent experiments (trials).

Standard Deviation of a DRV: σ = √( Σ x² · P(x) – μ² )

Example: Rolling one die

x : P(x)   : x² · P(x)
1 : 0.1667 : 0.17
2 : 0.1667 : 0.67
3 : 0.1667 : 1.50
4 : 0.1667 : 2.67
5 : 0.1667 : 4.17
6 : 0.1667 : 6.00
P(any other value) = 0 : 0
Σ x² · P(x) = 15.17

15.17 – (3.5)² = 15.17 – 12.25 = 2.92 => S.D. = 1.71
Simulations: N=10 => S.D. = 2.1; N=100 => S.D. = 1.67; N=1,000 => S.D. = 1.73
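Both the theoretical values (mean 3.5, S.D. about 1.71) and the behaviour of the simulations on the previous slides can be reproduced with a short script; a minimal sketch (the simulated values will vary from run to run):

```python
import random

# Theoretical mean and standard deviation of one fair die
values = range(1, 7)
p = 1 / 6
mean = sum(x * p for x in values)                         # 3.5
sd   = (sum(x**2 * p for x in values) - mean**2) ** 0.5   # sqrt(2.92), about 1.71

# Simulated rolls: the sample mean approaches 3.5 as the sample grows
for n in (10, 100, 1000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, sum(rolls) / n)

print(f"theoretical mean = {mean}, sd = {sd:.2f}")
```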

Binomial (Probability) Distribution Many experiments lead to dichotomous responses (i.e. either success/failure, yes/no etc.) Often a number of independent trials make up the experiment Example: number of people in a survey who agree with a particular statement? Survey 100 people => 100 independent trials of Yes/No The random variable of interest is the no. of successes (however defined) These are Binomial Random Variables

Binomial Distribution Example 4 people are tested for the presence of a particular gene. Success = presence of gene. P(gene present / success) = 0.55, P(gene absent / failure) = 0.45. P(3 randomly tested people from 4 have the gene)? Assume trials are independent, e.g. the people are not related. There are 4 ways of getting 3 successes (Y = gene present, N = gene absent): YYYN, YYNY, YNYY, NYYY.

Binomial Distribution Example Using the independence (multiplicative) rule we can calculate the probability of each outcome: Outcome 1: 0.55 × 0.55 × 0.55 × 0.45 = 0.07486875; Outcome 2: 0.55 × 0.55 × 0.45 × 0.55 = 0.07486875; Outcome 3: 0.55 × 0.45 × 0.55 × 0.55 = 0.07486875; Outcome 4: 0.45 × 0.55 × 0.55 × 0.55 = 0.07486875. There are 4 ways of getting the result, each with P = 0.07486875 => 4 × 0.07486875 = 0.299475 => P(3 randomly tested people have the gene) = 0.299475

Binomial Distribution Example A more convenient way of mathematically writing the same result is as follows: the number of ways you can get three successes from 4 trials is a combination: 4C3 = 4! / (3! · 1!) = 4, so P(3 successes) = 4C3 × (0.55)³ × (0.45)¹ = 0.299475

Binomial Distribution – General Formula This all leads to a very general rule for calculating binomial probabilities. In general, for a Binomial(n, p) random variable, where n = no. of trials, p = probability of a success and x = RV (no. of successes): P(X = x) = nCx · p^x · (1 – p)^(n – x), where P(X = x) is read as the probability of seeing x successes.
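A minimal sketch of this formula in Python, checked against the gene example (n = 4, p = 0.55); the helper name binomial_pmf is just illustrative:

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) for a Binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Gene example: P(exactly 3 of 4 randomly tested people have the gene)
print(binomial_pmf(3, 4, 0.55))                          # 0.299475

# P(X < 3) = P(0) + P(1) + P(2)
print(sum(binomial_pmf(x, 4, 0.55) for x in range(3)))   # about 0.609
```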

Binomial Distribution For all binomials the mean is given by the simple formula μ = n × p. Example: from the previous example, μ = 4 × 0.55 = 2.2. The standard deviation also has a simple formula for all binomials: σ = √( n × p × (1 – p) ). Example: from the previous example, σ = √( 4 × 0.55 × 0.45 ) = 0.995

Binomial Distribution What is P(< 3 people have the gene) from a group of four people tested at random? Use the fact that the possible outcomes are mutually exclusive (ME): P(X < 3) = P(0) + P(1) + P(2) = 0.041 + 0.2 + 0.368 = 0.609 [to 3 decimal places]. We can write this probability as P(X < 3) = 0.609.

Binomial Question There are two hospitals in a town. In Hospital A, 10 babies are born each day, in Hospital B there are 30 babies born each day. If the hospitals only count those days on which over 70% of babies born are girls, and assuming the probability that a girl is born is ½, which of the two hospitals will record more such days? Hospital A: Binomial (n=10, p=0.5) Hospital B: Binomial (n=30, p=0.5)

Answer Taking 'over 70% girls' to mean at least 8 of the 10 births in Hospital A and at least 22 of the 30 births in Hospital B: Hospital A: P(X ≥ 8), X ~ Binomial(10, 0.5) ≈ 0.055. Hospital B: P(Y ≥ 22), Y ~ Binomial(30, 0.5) ≈ 0.008. There is a higher probability of over 70% of the babies born being girls in Hospital A (the smaller hospital), so it will record more such days.
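A quick numerical check of the comparison; a minimal sketch assuming 'over 70%' means at least 8 girls out of 10 and at least 22 out of 30:

```python
from math import comb

def binomial_pmf(x, n, p=0.5):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# P(more than 70% girls) on a given day
p_hospital_a = sum(binomial_pmf(x, 10) for x in range(8, 11))   # X >= 8 of 10, about 0.055
p_hospital_b = sum(binomial_pmf(x, 30) for x in range(22, 31))  # X >= 22 of 30, about 0.008

print(p_hospital_a, p_hospital_b)   # the smaller hospital records more such days
```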

Binomial Question A flu virus hits a company employing 180 people. Independently of other employees, there is a probability p = 0.35 that each person needs to take sick leave. What is the expectation and variance of the proportion of the workforce who need to take sick leave? In general, what is the value of the sick rate p that produces the largest variance for this proportion?

Poisson Probability Many experiments don't have a simple success/failure response Responses can be the number of events occurring over time, area, volume etc. We don't know the number of 'failures' just the number of successes. Example: The number of calls to a telesales company - we know how many calls got through (successes) - but don't know how many failed (lines busy etc.) Knowledge of the mean number of events over time etc => Poisson Random Variable Events must occur randomly

Poisson Probability Distribution Probability Distribution for Poisson: P(X = x) = e^(–μ) · μ^x / x!, where μ is the known mean, x is the value of the RV with possible values 0, 1, 2, 3, …, and e is an irrational constant (like π) with value 2.71828… The standard deviation, σ, is given by the simple relationship σ = √μ

Example: Bombing of London WW2 In 1944 German V1 rockets fell on London. Were they aimed at specific targets or falling randomly? This was important in AA strategy & Civil Defence. Divide London into a 24 × 24 grid of equal sizes (576 equal square areas).

Example: Bombing of London WW2 If the rockets are random, they should fall according to a Poisson random variable per square. μ (mean) = No. of bombs / No. of squares = 535/576 = 0.9288. So, for a particular square (assuming randomness): P(x) = e^(–0.9288) · (0.9288)^x / x!, where x is the number of bombs landing in the square on the map grid.

Example: Bombing of London WW2 The prediction from the Poisson distribution was so good that the British concluded the rockets were not being aimed at specific targets; they were falling randomly on London.

x = no. of rockets : P(x) : 576 × P(x) (predicted squares) : Actual no. of squares hit
0 : 0.395 : 228 : 229
1 : 0.367 : 211 : 211
2 : 0.170 : 98 : 93
3 : 0.053 : 31 : 35
4 : 0.012 : 7 : 7
> 4 (i.e. 5+) : 0.003 : 2 : 1
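The predicted column can be reproduced from the Poisson formula with μ = 535/576; a minimal sketch:

```python
from math import exp, factorial

def poisson_pmf(x: int, mu: float) -> float:
    """P(X = x) for a Poisson random variable with mean mu."""
    return exp(-mu) * mu**x / factorial(x)

mu = 535 / 576                      # bombs per grid square, about 0.9288
squares = 576

for x in range(5):
    p = poisson_pmf(x, mu)
    print(x, round(p, 3), round(squares * p))   # e.g. x=0: 0.395, about 228 squares

p_over_4 = 1 - sum(poisson_pmf(x, mu) for x in range(5))
print(">4", round(p_over_4, 3), round(squares * p_over_4))
```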

Other Basic Discrete Probability Distributions Geometric – no. of independent trials to the first success. Negative Binomial – no. of independent trials to the first, second, third, fourth… success. Hypergeometric – lottery-type experiments. Many others…

Question The number of cracks in a ceramic tile has a Poisson distribution with a mean µ = 2.4. What is the probability that a tile has no cracks? What is the probability that a tile has four or more cracks?

Question Time

Exercise: There is more than one way to skin a cat! If two fair dice are thrown, what is the probability that at least one score is a prime number (2, 3, 5)? What is the complement of the event? What is its probability? There are (at least) three ways we can approach this problem.

Exercise Consider a multiple choice test in which: Each question has 4 possible answers of which only 1 is correct The test is made up of 10 questions The pass mark is 40% How well could we do if we just guessed each answer? What is the probability of guessing a single question correctly? What is the probability of getting no answers correct in the test? What is the probability of getting at least one question correct in the test? What is the probability of getting a score of 40% in the test? What is the probability of passing the test?

Next Week A quick recap; solutions to questions; continuous distributions.