Computing Fundamentals 2 Lecture 6 Probability Lecturer: Patrick Browne

Probability If a die is thrown, we consider it certain that it will land, but whether it shows a 6 is a matter of chance. With s successes in n experiments, f = s/n is called the relative frequency of success. It becomes stable in the long run, and it is this long-term stability (the limiting value) that forms the basis of probability.

Sample Space and Events The sample space S is the set of all possible outcomes of a given experiment. An element or outcome in S is called a sample point (or sample). An event A is a set of outcomes; it is a subset of the sample space. The singleton {a}, where a ∈ S, is called an elementary event. The empty set, ∅, sometimes represents an impossible event.

Sample Space and Events An event is a set, hence we can use set operations to combine events. A ∪ B is the event that occurs whenever A occurs or B occurs (or both). A ∩ B is the event that occurs whenever A and B both occur. Aᶜ is the event that occurs whenever A does not occur (the complement of A). Two events are mutually exclusive if they are disjoint: A ∩ B = ∅.

Sample Space and Events Toss a die and observe the top number: S = {1,2,3,4,5,6}. Let A be the even-number event, B the odd-number event, and C the prime-number event. A = {2,4,6}, B = {1,3,5}, C = {2,3,5}. A ∪ C = {2,3,4,5,6}, B ∩ C = {3,5}, Cᶜ = {1,4,6}. A and B are mutually exclusive.

Sample Space and Events Toss 3 coins and observe the H & T sequence: S = {HHH,HHT,HTH,HTT,THH,THT,TTH,TTT}. Let A be the event of two consecutive heads and B the event that all tosses give the same outcome. A = {HHH,HHT,THH}, B = {HHH,TTT}. A ∩ B = {HHH} is the elementary event with only heads.

Probability Spaces A probability space is a triple (S, A, P), where: S is the sample space, the set of all possible outcomes; A is the event space, a collection of events (sets of outcomes); P is a probability measure. We also have a set of probability axioms, e.g. the probability of an event is a non-negative real number.

Probability Spaces A probability space consists of a sample space together with a positive, additive measure, called a probability measure, which sums to one; the points of the sample space represent the different possible outcomes of the phenomenon, and the probability measure assigns probabilities to sets of outcomes.

Finite Probability Spaces Let S be a finite sample space, S = {a₁, a₂, ..., aₙ}. A finite probability space is obtained by assigning to each sample point aᵢ ∈ S a real number pᵢ, called the probability of aᵢ, satisfying the following conditions: each pᵢ is non-negative, and the sum of the pᵢ is one. We write P(A) for the sum of the probabilities of the sample points in A.

Finite Probability Spaces Three runners A, B, C; A is twice as likely to win as B, and B is twice as likely to win as C. What are P(A), P(B), and P(C)? Let P(C) = p, so P(B) = 2p and P(A) = 4p. Then p + 2p + 4p = 1, therefore p = 1/7. P(A) = 4/7, P(B) = 2/7, P(C) = 1/7. P({B,C}) = P(B) + P(C) = 3/7.
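
This kind of calculation can be checked by representing the finite probability space as a mapping from sample points to probabilities. The Python sketch below is illustrative only (not from the lecture); the names space and prob_of_event are made up for the example.

from fractions import Fraction

# Finite probability space for the three runners: P(A)=4p, P(B)=2p, P(C)=p, with 7p = 1.
p = Fraction(1, 7)
space = {"A": 4 * p, "B": 2 * p, "C": p}

# The probabilities must be non-negative and sum to one.
assert all(pr >= 0 for pr in space.values())
assert sum(space.values()) == 1

def prob_of_event(event):
    """P(event) is the sum of the probabilities of the sample points in the event."""
    return sum(space[point] for point in event)

print(prob_of_event({"A"}))        # 4/7
print(prob_of_event({"B", "C"}))   # 3/7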

Equiprobable Spaces If all the sample points in a finite probability space have the same probability, it is known as an equiprobable space. An example is a fair die, where each number is equally likely: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6.

Equiprobable Spaces If S contains n points, then the probability of each point is 1/n. If an event A contains r points, then its probability is r × (1/n) = r/n, i.e. P(A) = (number of elements in A) / (number of elements in S).

Equiprobable Spaces S = the 52 cards in a deck. A = the card is a spade, B = the card is a face card. P(A) = 13/52, P(B) = 12/52, P(A ∩ B) = 3/52.
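
In an equiprobable space these probabilities are just counting. The Python sketch below is illustrative (not part of the slides); it enumerates a 52-card deck and counts the favourable outcomes.

from fractions import Fraction
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = list(product(ranks, suits))   # 52 equally likely sample points

spades = [c for c in deck if c[1] == "spades"]
faces = [c for c in deck if c[0] in {"J", "Q", "K"}]
face_spades = [c for c in deck if c[1] == "spades" and c[0] in {"J", "Q", "K"}]

n = len(deck)
print(Fraction(len(spades), n))       # 1/4  (13/52)
print(Fraction(len(faces), n))        # 3/13 (12/52)
print(Fraction(len(face_spades), n))  # 3/52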

Axioms of Finite Probability Spaces
1. For every event A, 0 ≤ P(A) ≤ 1.
2. P(S) = 1, where S is the sample space.
3. If events A and B are mutually exclusive (disjoint), then P(A ∪ B) = P(A) + P(B).

Theorems of Finite Probability Spaces
1. P(∅) = 0
2. P(Aᶜ) = 1 − P(A)
3. P(A \ B) = P(A) − P(A ∩ B)
4. A ⊆ B implies P(A) ≤ P(B)
5. P(A) ≤ 1
6. P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
7. P(A ∩ B) = P(A) × P(B|A), where P(B|A) reads "the probability of B given A"

Addition P(A ∪ B) = P(A) + P(B) − P(A ∩ B). The addition rule is used when we have two events and we want to know the probability that either event occurs (A union B). In the addition rule, A and B may or may not be disjoint. Mutually exclusive (disjoint) events cannot occur together, so P(A ∩ B) = 0, and the addition rule reduces to P(A ∪ B) = P(A) + P(B).

Addition Rule Example Suppose a student is selected at random from 100 students, where 30 are taking maths, 20 are taking chemistry, and 10 are taking both maths and chemistry. Find the probability p that the student is taking maths or chemistry. P(M) = 30/100, P(C) = 20/100, P(M ∩ C) = 10/100. P(M ∪ C) = P(M) + P(C) − P(M ∩ C) = 30/100 + 20/100 − 10/100 = 2/5.
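
As a sanity check, the same numbers can be pushed through the addition rule in code; this small Python sketch is illustrative only.

from fractions import Fraction

p_m = Fraction(30, 100)        # P(M): taking maths
p_c = Fraction(20, 100)        # P(C): taking chemistry
p_m_and_c = Fraction(10, 100)  # P(M ∩ C): taking both

p_m_or_c = p_m + p_c - p_m_and_c   # addition rule
print(p_m_or_c)                    # 2/5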

Rule of Multiplication The multiplication rule is used when we want to know the probability that two events both occur (A intersection B). The probability that events A and B both occur is equal to the probability that A occurs times the probability that B occurs, given that A has occurred: P(A ∩ B) = P(A) × P(B|A).

Rule of Multiplication A bag contains 6 red marbles and 4 blue marbles. Two marbles are drawn without replacement from the bag. What is the probability that both of the marbles are blue? A = first marble is blue, B = second marble is blue. Therefore, P(A) = 4/10, P(B|A) = 3/9. Using P(A ∩ B) = P(A) P(B|A) P(A ∩ B) = (4/10) * (3/9) = 12/90 = 2/15

Rule of Multiplication A bag contains 6 red marbles and 4 blue marbles. Two marbles are drawn with replacement from the bag. What is the probability that both of the marbles are blue? A = first marble is blue, B = second marble is blue. Therefore, P(A) = 4/10, P(B|A) = 4/10. Using P(A ∩ B) = P(A) P(B|A) P(A ∩ B) = (4/10) * (4/10) = 16/100 = 4/25
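
Both marble answers can also be approximated by simulation. The Monte Carlo sketch below is illustrative (not from the slides); it should give values close to 2/15 ≈ 0.133 and 4/25 = 0.16.

import random

def both_blue(with_replacement, trials=100_000):
    """Estimate P(both drawn marbles are blue) from a bag of 6 red and 4 blue."""
    bag = ["red"] * 6 + ["blue"] * 4
    hits = 0
    for _ in range(trials):
        if with_replacement:
            draws = [random.choice(bag), random.choice(bag)]
        else:
            draws = random.sample(bag, 2)   # two distinct marbles, no replacement
        if draws[0] == "blue" and draws[1] == "blue":
            hits += 1
    return hits / trials

print(both_blue(with_replacement=False))  # ≈ 2/15 ≈ 0.133
print(both_blue(with_replacement=True))   # ≈ 4/25 = 0.16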

Conditional Probability Let E be an event in S with P(E) > 0. The conditional probability of A given E, written P(A|E), is the probability that A has occurred given that E has occurred: P(A|E) = P(A ∩ E) / P(E). In an equiprobable space this becomes P(A|E) = (number of elements in A ∩ E) / (number of elements in E).

Example: Conditional Probability Alternatively, P(A|E) = (number of ways A and E can occur) / (number of ways E can occur). Suppose the sum of a pair of tossed dice is 6. E = {sum is 6}, 5 ways: {(1,5),(2,4),(3,3),(4,2),(5,1)}. A = {at least one two}, 2 of these ways: {(2,4),(4,2)}. P(A|E) = 2/5.
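
The same answer falls out of brute-force enumeration of the 36 equally likely dice pairs; this Python sketch is illustrative only.

from fractions import Fraction
from itertools import product

pairs = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
E = [p for p in pairs if sum(p) == 6]          # the sum is 6
A_and_E = [p for p in E if 2 in p]             # at least one two, within E

print(len(E))                          # 5
print(Fraction(len(A_and_E), len(E)))  # 2/5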

Example 2: Conditional Probability P(A|E) = (number of ways A and E can occur) / (number of ways E can occur). From a class of 12 boys and 4 girls, 3 students are selected at random. What is the probability that they are all boys? P = C(12,3)/C(16,3) = 11/28. Alternatively, P = (12/16)(11/15)(10/14) = 11/28.
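
The counting version can be verified with the standard library's math.comb; the sketch below is illustrative.

from fractions import Fraction
from math import comb

# Favourable selections (3 boys from 12) over all selections (any 3 from 16).
p_all_boys = Fraction(comb(12, 3), comb(16, 3))
print(p_all_boys)   # 11/28

# The sequential (conditional) form gives the same answer.
print(Fraction(12, 16) * Fraction(11, 15) * Fraction(10, 14))   # 11/28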

Independence Two events are independent if the occurrence of one gives us no information about whether or not the other will occur; that is, the events have no influence on each other. We say that two events A and B are independent if the probability that they both occur is equal to the product of their individual probabilities, i.e. P(A ∩ B) = P(A) × P(B). If two events (each with non-zero probability) are independent then they cannot be mutually exclusive (disjoint), and vice versa.

Example: Independence Events A and B are independent if P(A ∩ B) = P(A) ∙ P(B); otherwise they are dependent. A coin is tossed three times: S = {HHH,HHT,HTH,HTT,THH,THT,TTH,TTT}. A = {first toss head}, B = {second toss head}, C = {exactly 2 heads tossed in a row}.

Example: Independence Continuing, a coin tossed three times: A = {HHH,HHT,HTH,HTT}, P(A) = 4/8 (1st toss head). B = {HHH,HHT,THH,THT}, P(B) = 4/8 (2nd toss head). C = {HHT,THH}, P(C) = 2/8 = 1/4 (exactly 2 heads in a row). P(A ∩ B) = P({HHH,HHT}) = 1/4. P(A ∩ C) = P({HHT}) = 1/8. P(B ∩ C) = P({HHT,THH}) = 1/4.

Example: Independence Continuing, a coin tossed three times: P(A ∩ B) = P({HHH,HHT}) = 1/4, P(A ∩ C) = P({HHT}) = 1/8, P(B ∩ C) = P({HHT,THH}) = 1/4. P(A)P(B) = (1/2) × (1/2) = 1/4 = P(A ∩ B), so A and B are independent. P(A)P(C) = (1/2) × (1/4) = 1/8 = P(A ∩ C), so A and C are independent. P(B)P(C) = (1/2) × (1/4) = 1/8 ≠ P(B ∩ C), so B and C are not independent; they are dependent.
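
These independence checks can be automated by enumerating the eight equally likely sequences; the Python sketch below is illustrative only.

from fractions import Fraction
from itertools import product

S = ["".join(seq) for seq in product("HT", repeat=3)]   # 8 equally likely sequences

def prob(event):
    return Fraction(len(event), len(S))

A = {s for s in S if s[0] == "H"}                        # first toss head
B = {s for s in S if s[1] == "H"}                        # second toss head
C = {s for s in S if "HH" in s and s.count("H") == 2}    # exactly two heads, in a row

for name, (X, Y) in {"A,B": (A, B), "A,C": (A, C), "B,C": (B, C)}.items():
    independent = prob(X & Y) == prob(X) * prob(Y)
    print(name, "independent" if independent else "dependent")
# Prints: A,B independent; A,C independent; B,C dependent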

Repeated Trials The Law of Averages states that, in the long run, over repeated trials, random fluctuations average out and the average of our observations approaches the expected value. At the same time, as the number of observations increases, the absolute difference between the observed count and the expected count tends to grow, even though the relative frequency converges.

Repeated Trials Let S* be a finite probability space. By n independent or repeated trials we mean the probability space S consisting of all ordered n-tuples of elements of S*, with the probability of an n-tuple defined to be the product of the probabilities of its components: P(s₁, s₂, ..., sₙ) = P(s₁) × P(s₂) × ... × P(sₙ).

Repeated Trials Let the probability space S* = {a, b, c} represent three runners in a race, with probabilities of winning P(a) = 1/2, P(b) = 1/3, P(c) = 1/6. If there are two races, then the sample space S consisting of two repeated trials is S = {aa,ab,ac,ba,bb,bc,ca,cb,cc}.

Repeated Trials S = {aa,ab,ac,ba,bb,bc,ca,cb,cc}. The probabilities of the sample points of S are: P(aa) = (1/2)×(1/2) = 1/4, P(ab) = (1/2)×(1/3) = 1/6, P(ac) = (1/2)×(1/6) = 1/12, P(ba) = (1/3)×(1/2) = 1/6, P(bb) = (1/3)×(1/3) = 1/9, P(bc) = (1/3)×(1/6) = 1/18, P(ca) = (1/6)×(1/2) = 1/12, P(cb) = (1/6)×(1/3) = 1/18, P(cc) = (1/6)×(1/6) = 1/36. The probability of c winning the first race and a the second is P(ca) = 1/12. Check: 1/4 + 1/6 + 1/12 + 1/6 + 1/9 + 1/18 + 1/12 + 1/18 + 1/36 = 1.
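
The nine product probabilities can be generated and checked mechanically; the Python sketch below is illustrative (the names single_race and two_races are made up).

from fractions import Fraction
from itertools import product

single_race = {"a": Fraction(1, 2), "b": Fraction(1, 3), "c": Fraction(1, 6)}

# Two independent races: the probability of an ordered pair is the product.
two_races = {x + y: single_race[x] * single_race[y]
             for x, y in product(single_race, repeat=2)}

print(two_races["ca"])          # 1/12: c wins the first race, a the second
print(sum(two_races.values()))  # 1: the nine probabilities sum to one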

Bernoulli Trials A Bernoulli trial is a random experiment in which there are only two possible outcomes: success and failure. If p is the probability of success, then q = 1 − p is the probability of failure. Often we are interested in the number of successes without considering their order. The probability of exactly k successes in n repeated trials is b(k, n, p) = C(n, k) pᵏ qⁿ⁻ᵏ, where C(n, k) is the binomial coefficient.

Example: Bernoulli Trials A coin is tossed 6 times, with H = success and T = failure, so n = 6 and p = q = 1/2. The probability of exactly two heads (k = 2), using the binomial coefficient C(6,2) = 15, is b(2, 6, 1/2) = C(6,2) (1/2)² (1/2)⁴ = 15/64.
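
The binomial formula is easy to evaluate directly; the Python sketch below is illustrative (the function name bernoulli is made up).

from fractions import Fraction
from math import comb

def bernoulli(k, n, p):
    """Probability of exactly k successes in n independent Bernoulli trials."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

print(bernoulli(2, 6, Fraction(1, 2)))   # 15/64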

Identically Distributed Variables Random variables are identically distributed if they have the same probability distribution.