1 Probability

2 Probability
Probability is fundamental to scientific inference.
Deterministic vs. probabilistic systems: when knowledge of a system is incomplete, its behavior is not perfectly predictable, and events appear to occur randomly, following a probability structure.

3 Definitions
Sample space (S): the set of all possible outcomes.
- Coin flip: {H, T}
- Roll of a die: {1, 2, 3, 4, 5, 6}
- Exam score: {1, 2, 3, ..., 100}
Trial: one sample or experiment yielding an observation (a coin flip, a die roll, one exam score).

4 Definitions (cont.)
Operations, where A and B are two events in S that could occur on any given trial:
- ~A (not A): A does not occur
- A ∪ B (union): either A or B occurs
- A ∩ B (intersection, also written A,B): both A and B occur
- A \ B (= A ∩ ~B): A occurs and B does not

5 Definitions (cont.)
Exhaustive events: A ∪ B = S. Example: for a coin flip, H ∪ T = S.
Exclusive events: A ∩ B = ∅. Example: heads and tails can never occur together.
Basic probability rules:
- p(A) ≥ 0
- p(S) = 1
- p(A ∪ B) = p(A) + p(B) − p(A ∩ B)
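A quick numerical check of the addition rule in R (the die-roll events below are this editor's illustrative example, not from the slides):
S <- 1:6
A <- c(2, 4, 6)                            # even faces
B <- c(4, 5, 6)                            # faces greater than 3
p <- function(E) length(E) / length(S)     # equally likely outcomes
p(union(A, B))                             # 0.667
p(A) + p(B) - p(intersect(A, B))           # 0.667, matches the addition rule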

6 Conditional Probability
Probability that an event B will occur given that another event A has already occurred.
- Example: probability of lung cancer (unconditional).
- Example: probability of lung cancer given that you are a smoker (conditional).
p(B|A) = p(B,A) / p(A)

7 Conditional Probability
Example: a deck of cards.
- 52 cards, 4 suits with 13 cards in each suit.
- p(ace) = 4/52 (unconditional).
- p(second card is an ace | first card is an ace), assuming no replacement:
- p(B,A) = 12/2652 (ordered pairs of aces / all possible ordered pairs)
- p(B|A) = (12/2652)/(4/52) = 3/51
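A sketch verifying the calculation in R (variable names are this editor's own; A = first card is an ace, B = second card is an ace):
p_A  <- 4 / 52                 # first card is an ace
p_AB <- (4 * 3) / (52 * 51)    # 12/2652: both cards are aces, no replacement
p_AB / p_A                     # 0.0588...
3 / 51                         # the same value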

8 Joint Probability
Intersection of two events: p(A,B), the probability of A and B.
Can be viewed as a function of a conditional and an unconditional probability:
p(A,B) = p(B|A) p(A)
(the probability of A times the probability of B given that A has occurred)

9 Joint Probability
Multiplication rule: generalizes joint probability to more than two events.
p(A1, A2, A3, ..., An) = p(A1) p(A2|A1) p(A3|A1,A2) ... p(An|A1,A2,...,An-1)
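An illustrative use of the multiplication rule in R (the three-ace example is this editor's, not from the slides):
(4 / 52) * (3 / 51) * (2 / 50)   # p(A1) p(A2|A1) p(A3|A1,A2): three aces in a row without replacement, about 0.000181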

10 Bayes Theorem
Conceptually challenging but mathematically simple.
Frequentist vs. subjective views of probability:
- frequentist: the expected number of occurrences over a set of trials
- subjective: a degree of belief
Bayes' rule may be viewed as a way to update a belief or subjective probability, p(A), as evidence comes in, giving p(A|B).

11 Bayes Theorem
p(A|B) = p(A,B)/p(B)
       = p(B|A)p(A)/p(B)
       = p(B|A)p(A) / (p(B|A)p(A) + p(B|~A)p(~A))
p(A) = prior belief; p(A|B) = posterior belief.
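A minimal R helper implementing the last form above (the function name, arguments, and example numbers are this editor's own, not from the slides):
posterior <- function(prior, p_b_given_a, p_b_given_not_a) {
  prior * p_b_given_a / (prior * p_b_given_a + (1 - prior) * p_b_given_not_a)
}
posterior(prior = 0.5, p_b_given_a = 0.9, p_b_given_not_a = 0.2)   # 0.818...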

12 Bayes Rule - Example
Morse code: dots and dashes occur in the proportion 3:4.
Sometimes a sent dot is not received as a dot; assume p(error) = 1/8.
- D = a dot was sent
- R = a dot was received
p(D) = 3/7: the probability a dot is sent (3 times out of 7 a dot is sent).
Question: if a dot is received, what is the probability that a dot was sent?

13 Bayes Rule - Example
Find p(D|R).
p(D|R) = p(R|D)p(D)/p(R)
p(D) = 3/7
p(R|D) = 7/8
p(R) = p(R|D)p(D) + p(R|~D)p(~D) = (7/8)(3/7) + (1/8)(4/7) = 25/56
p(D|R) = (7/8)(3/7) / (25/56) = 21/25 = 0.84
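The same calculation checked in R (variable names are this editor's own):
p_D            <- 3 / 7    # prior: a dot was sent
p_R_given_D    <- 7 / 8    # a sent dot is received as a dot
p_R_given_notD <- 1 / 8    # a sent dash is wrongly received as a dot
p_R <- p_R_given_D * p_D + p_R_given_notD * (1 - p_D)   # 25/56
p_R_given_D * p_D / p_R                                  # 0.84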

14 Independence
Following Bayes' rule, two events are independent if the occurrence of event B does not alter the probability of event A:
p(A) = p(A|B)
p(A,B) = p(A) p(B)
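A numerical illustration in R (the die-roll events are this editor's example, not from the slides): with A = "even face" and B = "face of 4 or less", p(A,B) = p(A) p(B).
S <- 1:6
A <- c(2, 4, 6)     # even faces
B <- 1:4            # faces of 4 or less
length(intersect(A, B)) / length(S)                 # p(A,B) = 2/6
(length(A) / length(S)) * (length(B) / length(S))   # p(A) p(B) = (1/2)(2/3) = 2/6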

15 Conditional Independence
Two events, A and B, are independent given C if:
p(A,B|C) = p(A|C) p(B|C)
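A sketch of the distinction in R, using a hypothetical two-coin setup that is not in the slides: pick a fair coin (p = 0.5) or a biased coin (p = 0.8) at random, then flip it twice. Given the coin, the two flips are independent; marginally they are not.
p_fair <- 0.5
p_bias <- 0.8
p_A  <- 0.5 * p_fair + 0.5 * p_bias          # marginal probability of heads on one flip
p_AB <- 0.5 * p_fair^2 + 0.5 * p_bias^2      # probability both flips are heads (conditional independence within each coin)
c(p_AB, p_A^2)                               # 0.445 vs 0.4225: not equal, so the flips are not marginally independent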

16 Discrete Random Variables
A variable that takes on a finite number of states with a given probability structure.
Example: the faces of a die.
Our focus is on discrete random variables with two states, e.g. right/wrong or agree/disagree.

17 Discrete Random Variables
Bernoulli trial: an observation where a random variable may take on only one of two states (yes/no).
Probability mass function:
x_i   P(X = x_i)
0     1 − p
1     p
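The Bernoulli p.m.f. and simulated trials in R (p = 0.3 is an arbitrary illustrative value):
p <- 0.3
dbinom(0:1, size = 1, prob = p)   # P(X = 0) = 1 - p, P(X = 1) = p
rbinom(10, size = 1, prob = p)    # ten simulated Bernoulli trials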

18 Discrete Random Variables
Binomial distribution: X counts successes over n independent Bernoulli trials (binary random variables X_i), each having the same probability p of success.
Gives the probability of x successes in n trials.

19 Discrete Random Variables
Binomial probability mass function. Example: n = 3.
Sequence (c_i)   x   Probability of sequence
0, 0, 0          0   (1 − p)^3
0, 0, 1          1   p(1 − p)^2
0, 1, 0          1   p(1 − p)^2
1, 0, 0          1   p(1 − p)^2
0, 1, 1          2   p^2(1 − p)
1, 0, 1          2   p^2(1 − p)
1, 1, 0          2   p^2(1 − p)
1, 1, 1          3   p^3
P(X = x) sums the rows with the same x, e.g. P(X = 1) = 3p(1 − p)^2.
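The same p.m.f. from dbinom (p = 0.4 is this editor's illustrative value, used only for the check):
p <- 0.4
dbinom(0:3, size = 3, prob = p)                            # P(X = 0), ..., P(X = 3)
c((1 - p)^3, 3 * p * (1 - p)^2, 3 * p^2 * (1 - p), p^3)    # same values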

20 Discrete Random Variables
X ~ Bin(n, p): n and p specify the parameters of a specific binomial distribution.
Cumulative distribution function (c.d.f.): F_X(x) = P(X ≤ x) = Σ_{x_j ≤ x} p_j
For n = 3:
x           F_X(x)
(−∞, 0)     0
[0, 1)      (1 − p)^3
[1, 2)      (1 − p)^3 + 3p(1 − p)^2
[2, 3)      (1 − p)^3 + 3p(1 − p)^2 + 3p^2(1 − p)
[3, ∞)      1
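Checking the c.d.f. table with pbinom (again with p = 0.4 as an illustrative value):
p <- 0.4
pbinom(0:3, size = 3, prob = p)           # F_X(0), F_X(1), F_X(2), F_X(3)
cumsum(dbinom(0:3, size = 3, prob = p))   # the same cumulative sums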

21 Discrete Random Variables
Graph of the binomial c.d.f. (figure not reproduced in this transcript).

22 Continuous Random Variables
Normal distribution, X ~ N(μ, σ), with probability density function
f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²))
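A quick plot of the standard normal density in R (the slide's figure is not reproduced; this sketch is this editor's own):
x <- seq(-4, 4, by = 0.1)
plot(x, dnorm(x), type = "l")   # density of N(0, 1)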

23 Normal Distribution
Cumulative distribution function: F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt

24 Normal Probability in R
Sampling from a given distribution:
X <- rnorm(400, mean = 3, sd = 3)
hist(X)
Determining the c.d.f. up to a given value:
pnorm(0)
pnorm(1.2)
x <- seq(-4, 4, by = .1)
y <- pnorm(x)
plot(x, y)

25 Binomial Probability in R
Distribution function:
> x <- seq(0, 50, by = 1)
> y <- dbinom(x, 50, 0.2)    # p.m.f. of Bin(50, 0.2)
> plot(x, y)
> y <- dbinom(x, 50, 0.6)    # p.m.f. of Bin(50, 0.6)
> plot(x, y)
> x <- seq(0, 100, by = 1)
> y <- dbinom(x, 100, 0.6)   # p.m.f. of Bin(100, 0.6)
> plot(x, y)

26 Binomial Probability in R
Cumulative probability distribution:
> pbinom(24, 50, 0.5)
[1] 0.4438624
> pbinom(25, 50, 0.5)
[1] 0.5561376
> pbinom(25, 51, 0.5)
[1] 0.5
> pbinom(26, 51, 0.5)
[1] 0.610116
> pbinom(25, 50, 0.5)
[1] 0.5561376
> pbinom(25, 50, 0.25)
[1] 0.999962
> pbinom(25, 500, 0.25)
[1] 4.955658e-33

27 Binomial Probability in R
Sampling:
> rbinom(5, 100, .2)
[1] 30 23 21 19 18
> rbinom(5, 100, .7)
[1] 66 66 58 68 63

