1 Sets, Combinatorics, Probability, and Number Theory
Mathematical Structures for Computer Science, Chapter 3. Copyright © 2006 W.H. Freeman & Co. MSCS Slides: Probability

2 Introduction to Finite Probability
If some action can produce Y different outcomes and X of those Y outcomes are of special interest, we may want to know how likely it is that one of the X outcomes will occur. For example, what are the chances of:
Getting “heads” when a coin is tossed? The probability of getting heads is “one out of two,” or 1/2.
Getting a 3 with a roll of a die? A six-sided die has six possible results, and the 3 is exactly one of these possibilities, so the probability of rolling a 3 is 1/6.
Drawing either the ace of clubs or the queen of diamonds from a standard deck of cards? A standard card deck contains 52 cards, two of which are successful results, so the probability is 2/52, or 1/26.
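Each of these answers is a count of favorable outcomes divided by a count of all outcomes. A minimal Python sketch that reproduces the card-drawing answer by counting (the deck representation is illustrative, not from the slides):

# Build a 52-card deck and count the outcomes of interest.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(rank, suit) for rank in ranks for suit in suits]

# Event: drawing the ace of clubs or the queen of diamonds.
favorable = [card for card in deck if card in {("A", "clubs"), ("Q", "diamonds")}]

probability = len(favorable) / len(deck)
print(probability)  # 2/52 = 1/26, about 0.0385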

3 Introduction to Finite Probability
The set of all possible outcomes of an action is called the sample space S of the action. Any subset of the sample space is called an event. If S is a finite set of equally likely outcomes, then the probability P(E) of event E is defined to be
P(E) = |E| / |S|
For example, when a coin is flipped twice, the sample space is S = {HH, HT, TH, TT} and the event "both flips come up heads" is E = {HH}, so the probability is
P(E) = |E| / |S| = 1/4
Probability therefore involves finding the size of sets, either of the sample space or of the event of interest. We may need to use the addition or multiplication principles, the principle of inclusion and exclusion, or the formula for the number of combinations of r things from n objects.
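As a quick illustration of the definition, here is a short Python sketch (not part of the original slides) that enumerates the two-flip sample space and counts the event:

from itertools import product

# Sample space: all sequences of two coin flips.
sample_space = list(product("HT", repeat=2))  # [('H','H'), ('H','T'), ('T','H'), ('T','T')]

# Event: both flips come up heads.
event = [outcome for outcome in sample_space if outcome == ("H", "H")]

print(len(event) / len(sample_space))  # |E| / |S| = 1/4 = 0.25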

4 Introduction to Finite Probability
Using the definition of P(E) from the previous slide, we can make some observations for any events E1 and E2 from a sample space S of equally likely outcomes:
P(∅) = 0 and P(S) = 1
0 ≤ P(E1) ≤ 1
P(E1 ∪ E2) = P(E1) + P(E2) - P(E1 ∩ E2)
P(E1') = 1 - P(E1)
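A brief Python check of the union rule above, using a single die roll (the choice of events is my own illustration, not from the slides):

# Sample space: one roll of a six-sided die.
S = {1, 2, 3, 4, 5, 6}
E1 = {x for x in S if x % 2 == 0}   # roll is even: {2, 4, 6}
E2 = {x for x in S if x >= 4}       # roll is at least 4: {4, 5, 6}

def P(event):
    """Probability of an event under equally likely outcomes."""
    return len(event) / len(S)

# P(E1 ∪ E2) should equal P(E1) + P(E2) - P(E1 ∩ E2).
print(P(E1 | E2))                  # 4/6, about 0.667
print(P(E1) + P(E2) - P(E1 & E2))  # 3/6 + 3/6 - 2/6 = 4/6, about 0.667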

5 Probability Distributions
A way to look at problems where not all outcomes are equally likely is to assign a probability distribution to the sample space. Consider each distinct outcome in the original sample space as an event and assign it a probability. If there are k different outcomes in the sample space and each outcome xi is assigned a probability p(xi), the following rules apply:
0 ≤ p(xi) ≤ 1, because any probability value must fall within this range.
p(x1) + p(x2) + … + p(xk) = 1, because the union of all of these k disjoint outcomes is the sample space S, and the probability of S is 1.
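A short Python sketch of a probability distribution satisfying both rules (the loaded-die values are made up for illustration):

# A distribution over the six faces of a loaded die (hypothetical values).
distribution = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}

# Rule 1: every probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in distribution.values())

# Rule 2: the probabilities sum to 1, since the outcomes partition S.
assert abs(sum(distribution.values()) - 1.0) < 1e-9

print(distribution[6])  # the loaded face comes up with probability 0.5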

6 Conditional Probability
Given events E1 and E2, the conditional probability of E2 given E1, P(E2 | E1), is
P(E2 | E1) = P(E1 ∩ E2) / P(E1)
For example, in a drug study of a group of patients, 17% responded positively to compound A, 34% responded positively to compound B, and 8% responded positively to both. The probability that a patient responded positively to compound B given that he or she responded positively to A is
P(B | A) = P(A ∩ B) / P(A) = 0.08 / 0.17 ≈ 0.47
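A tiny Python sketch applying the definition to the drug-study numbers (the helper function name is my own, not from the text):

def conditional_probability(p_joint, p_condition):
    """P(E2 | E1) = P(E1 ∩ E2) / P(E1)."""
    return p_joint / p_condition

p_A = 0.17        # responded positively to compound A
p_B = 0.34        # responded positively to compound B
p_A_and_B = 0.08  # responded positively to both

print(round(conditional_probability(p_A_and_B, p_A), 2))  # P(B | A) ≈ 0.47
print(round(conditional_probability(p_A_and_B, p_B), 2))  # P(A | B) ≈ 0.24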

7 Conditional Probability
If P(E2 | E1) = P(E2), then E2 is just as likely to happen whether E1 happens or not. In this case, E1 and E2 are said to be independent events. Then
P(E1 ∩ E2) = P(E1) * P(E2)
This can be extended to any finite number of independent events and can also be used to test whether events are independent.
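As an illustration of using the product rule as a test, here is a Python sketch with two dice (the specific events are my own example, not from the slides):

from itertools import product

# Sample space: all ordered pairs from rolling two dice.
S = list(product(range(1, 7), repeat=2))

E1 = [(a, b) for (a, b) in S if a == 4]       # first die shows 4
E2 = [(a, b) for (a, b) in S if a + b == 7]   # the two dice sum to 7

def P(event):
    return len(event) / len(S)

E1_and_E2 = [outcome for outcome in E1 if outcome in E2]

# Independent: P(E1 ∩ E2) equals P(E1) * P(E2) (both are 1/36).
print(P(E1_and_E2), P(E1) * P(E2))  # 0.0277... 0.0277...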

8 Expected Value
If the values in the sample space are not numerical, we may find a function X: S → R that associates a numerical value with each element in the sample space. Such a function is called a random variable. Given a sample space S = {x1, …, xk} to which a random variable X and a probability distribution p have been assigned, the expected value, or weighted average, of the random variable is
E(X) = X(x1)p(x1) + X(x2)p(x2) + … + X(xk)p(xk)
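A short Python sketch of this weighted average (the die-based random variable is a made-up illustration):

# Probability distribution over the outcomes of a fair die roll.
p = {face: 1 / 6 for face in range(1, 7)}

# Random variable X: a hypothetical payoff, in dollars, for each face.
X = {1: 0, 2: 0, 3: 0, 4: 1, 5: 1, 6: 5}

# Expected value: sum of X(xi) * p(xi) over all outcomes.
expected_value = sum(X[x] * p[x] for x in p)
print(expected_value)  # (0 + 0 + 0 + 1 + 1 + 5) / 6 ≈ 1.17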

9 Average Case Analysis of Algorithms
Expected value may help give an average-case analysis of an algorithm, i.e., tell the expected ("average") amount of work performed by the algorithm. Let the sample space S be the set of all possible inputs to the algorithm; we assume that S is finite. Let the random variable X assign to each member xi of S the number of work units required to execute the algorithm on that input, and let p be a probability distribution on S. Then the expected number of work units is
E(X) = X(x1)p(x1) + X(x2)p(x2) + … + X(xk)p(xk)
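A hedged Python sketch of this idea for sequential search, assuming the target is equally likely to be at any of the n positions (the example algorithm and list length are mine; the slides do not specify one):

n = 10  # list length (illustrative)

# Inputs: "target is at position i" for i = 1..n; X(xi) = i comparisons.
p = {i: 1 / n for i in range(1, n + 1)}   # uniform distribution over inputs
X = {i: i for i in range(1, n + 1)}       # sequential search does i comparisons

expected_work = sum(X[i] * p[i] for i in p)
print(expected_work)   # 5.5
print((n + 1) / 2)     # closed form (n + 1) / 2 = 5.5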

