1 Chapter 1: Random Events and Probability
Department of Statistics, Huang Xudong, Ph.D.

2 §1.1 Random Events

3 1.1.1 Random Experiments The basic notion in probability is that of a random experiment: an experiment whose outcome cannot be determined in advance, but which is nevertheless subject to analysis.

4 Examples of random experiments are:
1. tossing a die,
2. measuring the amount of rainfall in Brisbane in January,
3. counting the number of calls arriving at a telephone exchange during a fixed time period,
4. selecting a random sample of fifty people and observing the number of left-handers,
5. choosing at random ten people and measuring their height.

5 1.1.2 Sample Space
Definition: The sample space Ω of a random experiment is the set of all possible outcomes of the experiment.

6 Examples of random experiments with their sample spaces are:
1. Cast two dice consecutively: Ω = {(1, 1), (1, 2), ..., (6, 6)}, the set of ordered pairs of faces (see the sketch below),
2. the lifetime of a machine (in days): Ω = [0, ∞),
3. the number of arriving calls at an exchange during a specified time interval: Ω = {0, 1, 2, ...},
4. the heights of 10 selected people: Ω = {(x1, ..., x10) : xi ≥ 0, i = 1, ..., 10}.
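
To make the first example concrete, here is a minimal Python sketch (an illustration added to this transcript, not part of the original slides) that enumerates the sample space for casting two dice consecutively.

```python
from itertools import product

# Sample space for casting two dice consecutively:
# every ordered pair (first roll, second roll).
omega = set(product(range(1, 7), repeat=2))

print(len(omega))        # 36 possible outcomes
print((3, 5) in omega)   # True: (3, 5) is one possible outcome
```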

7 Discrete and continuous sample spaces
Definition: A sample space is finite if it has a finite number of elements.
Definition: A sample space is discrete if there are “gaps” between the different elements, or if the elements can be “listed”, even if the list is infinite (e.g. 1, 2, 3, ...). In mathematical language, a sample space is discrete if it is countable.
Definition: A sample space is continuous if there are no gaps between the elements, so the elements cannot be listed (e.g. the interval [0, 1]).

8 1.1.3 Events So far, we have introduced the sample space, Ω, which lists all possible outcomes of a random experiment and might seem unexciting. However, Ω is a set: it lays the ground for a whole mathematical formulation of randomness, in terms of set theory. The next concept we need to formulate is that of something that happens at random, an event. How would you express the idea of an event in terms of set theory?

9 Definition of events
Definition: An event is a subset of the sample space. That is, any collection of outcomes forms an event. Events will be denoted by capital letters A, B, C, ....
Note: We say that event A occurs if the outcome of the experiment is one of the elements in A.
Note: Ω is a subset of itself, so Ω is an event. The empty set, ∅ = {}, is also a subset of Ω; this is called the null event, or the event with no outcomes.

10 Examples of events are:
1. The event that the sum of two dice is 10 or more (see the sketch below),
2. the event that a machine lives less than 1000 days,
3. the event that out of fifty selected people, five are left-handed.
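
A small Python sketch (illustrative, not from the slides) that builds the first of these events, “the sum of two dice is 10 or more”, as a subset of the two-dice sample space, and checks whether it occurs for a particular outcome.

```python
from itertools import product

# Sample space: ordered pairs of rolls of two dice.
omega = set(product(range(1, 7), repeat=2))

# Event A: the sum of the two dice is 10 or more (a subset of omega).
A = {outcome for outcome in omega if sum(outcome) >= 10}

print(sorted(A))     # [(4, 6), (5, 5), (5, 6), (6, 4), (6, 5), (6, 6)]
print(A <= omega)    # True: an event is a subset of the sample space

# Event A occurs if the outcome of the experiment is an element of A.
outcome = (6, 5)
print(outcome in A)  # True: A occurred for this outcome
```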

11 Combining Events Formulating random events in terms of sets gives us the power of set theory to describe all possible ways of combining or manipulating events. For example, we need to describe things like coincidences (events happening together), alternatives, opposites, and so on. We do this in the language of set theory.

12 Example: Suppose our random experiment is to pick a person in the class and see what form(s) of transport they used to get to campus today. This sort of diagram representing events in a sample space is called a Venn diagram.

13 1. Alternatives: the union ‘or’ operator
Definition: Let A and B be events on the same sample space Ω: so A⊂Ω and B⊂Ω. The union of events A and B is written A∪B, and is given by A∪B = {ω ∈ Ω : ω ∈ A or ω ∈ B}.

14 2. Concurrences and coincidences: the intersection ‘and’ operator
Definition: The intersection of events A and B is written A ∩ B, and is given by A ∩ B = {ω ∈ Ω : ω ∈ A and ω ∈ B}.

15 3. Opposites: the complement or ‘not’ operator
Definition: The complement of event A is written Ā (or A^c), and is given by Ā = {ω ∈ Ω : ω ∉ A}.
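
The three operators above correspond directly to Python's set operations; the following sketch (illustrative, with arbitrarily chosen events) shows union, intersection, and complement on the sample space of one die roll.

```python
# A small sample space and two events on it (illustrative choices).
omega = set(range(1, 7))   # one roll of a die
A = {2, 4, 6}              # "roll is even"
B = {4, 5, 6}              # "roll is at least 4"

print(A | B)               # union A ∪ B ("A or B"): {2, 4, 5, 6}
print(A & B)               # intersection A ∩ B ("A and B"): {4, 6}
print(omega - A)           # complement of A ("not A"): {1, 3, 5}
```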

16 Examples: Experiment: Pick a person in this class at random.
Sample space: Ω = {all people in class}. Let event A = “person is male” and event B = “person travelled by bike today”.

17 Suppose I pick a male who did not travel by bike. Say whether the following events have occurred:

18 Properties of union, intersection, and complement

19 Distributive laws
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C), and A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C).
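
A quick numerical check of the distributive laws, together with De Morgan's laws from the same family of set identities, on three arbitrarily chosen events; this sketch is an illustration, not part of the original slides.

```python
# Three events on the sample space of one die roll (illustrative choices).
omega = set(range(1, 7))
A, B, C = {1, 2, 3}, {2, 4, 6}, {3, 6}

# Distributive laws.
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)

# De Morgan's laws (complements taken with respect to omega).
assert omega - (A | B) == (omega - A) & (omega - B)
assert omega - (A & B) == (omega - A) | (omega - B)

print("All identities hold for these example sets.")
```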

20 1.1.4 Partitioning sets and events
Definition: Events B1, B2, ..., Bk form a partition of the sample space Ω if they are mutually exclusive (Bi ∩ Bj = ∅ whenever i ≠ j) and their union is the whole sample space (B1 ∪ B2 ∪ ... ∪ Bk = Ω).

21 Examples:

22 Partitioning an event A
If B1, B2, ..., Bk form a partition of Ω, then any event A can be split into mutually exclusive pieces: A = (A ∩ B1) ∪ (A ∩ B2) ∪ ... ∪ (A ∩ Bk).

23 §1.2 Frequency and probability

24 1.2.1 Frequency Consider performing our experiment a large number n of times and counting the number of those times when A occurs. The relative frequency of A is then defined to be f_n(A) = n_A / n, where n_A is the number of times that A occurs.
Properties of frequency: 0 ≤ f_n(A) ≤ 1; f_n(Ω) = 1; and f_n(A ∪ B) = f_n(A) + f_n(B) whenever A and B are mutually exclusive.
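
As an illustration of relative frequency (a sketch added to this transcript, not from the slides): simulate tossing a fair die n times and watch f_n(A) for A = “the die shows a six” settle near 1/6 as n grows.

```python
import random

random.seed(1)   # fixed seed so the illustration is reproducible

def relative_frequency(n):
    """Toss a fair die n times; return n_A / n for the event A = 'die shows a six'."""
    n_A = sum(1 for _ in range(n) if random.randint(1, 6) == 6)
    return n_A / n

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))   # settles near 1/6 ≈ 0.1667 as n grows
```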

25 1.2.2 Probability: a way of measuring sets
Remember that you are given the job of building the science of randomness. This means somehow ‘measuring chance’. It was clever to formulate our notions of events and sample spaces in terms of sets: it gives us something to measure. ‘Probability’, the name that we give to our chance-measure, is a way of measuring sets.

26 Most of this course is about probability distributions.
A probability distribution is a rule according to which probability is apportioned, or distributed, among the different sets in the sample space. At its simplest, a probability distribution just lists every element in the sample space and allots it a probability between 0 and 1, such that the total sum of probabilities is 1.

27 Discrete probability distributions
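
At its simplest, a discrete distribution is just a table of outcomes and probabilities that sum to 1. The sketch below (a fair-die example added to this transcript, not from the slides) stores such a table and computes the probability of an event by summing over its elements.

```python
from fractions import Fraction

# Discrete probability distribution for one roll of a fair die:
# each element of the sample space gets a probability, and the probabilities sum to 1.
p = {outcome: Fraction(1, 6) for outcome in range(1, 7)}
assert sum(p.values()) == 1

# The probability of an event A is the sum of the probabilities of its elements.
A = {2, 4, 6}                   # "roll is even"
print(sum(p[w] for w in A))     # 1/2
```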

28 Continuous probability distributions
On a continuous sample space Ω, e.g. Ω = [0, 1], we cannot list all the elements and give them an individual probability. We will need more sophisticated methods, detailed later in the course. However, the same principle applies: a continuous probability distribution is a rule under which we can calculate a probability between 0 and 1 for any set, or event, A ⊆ Ω.

29 1.2.3 Probability Axioms For any sample space, discrete or continuous, all of probability theory is based on the following three definitions, or axioms. If our rule for ‘measuring sets’ satisfies the three axioms, it is a valid probability distribution.
Axiom 1: P(Ω) = 1.
Axiom 2: 0 ≤ P(A) ≤ 1 for every event A.
Axiom 3: If A1, A2, ... are mutually exclusive events (Ai ∩ Aj = ∅ whenever i ≠ j), then P(A1 ∪ A2 ∪ ...) = P(A1) + P(A2) + ....

30 Note: The axioms can never be ‘proved’: they are definitions.
Note: Remember that an EVENT is a SET: an event is a subset of the sample space.

31 1.2.4 Probabilities of combined events
In Section 1.1 we discussed unions, intersections, and complements of events. We now look at the probabilities of these combinations. Everything below applies to events (sets) in either a discrete or a continuous sample space.

32 1. Probability of a union Let A and B be events on a sample space Ω. There are two cases for the probability of the union A∪B:
1. A and B are mutually exclusive (no overlap), i.e. A ∩ B = ∅: then P(A ∪ B) = P(A) + P(B).
2. A and B are not mutually exclusive, A ∩ B ≠ ∅: then P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
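
A small numerical check of the two cases above, using arbitrarily chosen events on a fair die (an illustrative sketch, not part of the slides).

```python
from fractions import Fraction

# Equally likely outcomes: one roll of a fair die.
omega = set(range(1, 7))

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(omega))

# Case 1: mutually exclusive events, so probabilities simply add.
A, B = {1, 2}, {5, 6}
assert prob(A | B) == prob(A) + prob(B)

# Case 2: overlapping events, so subtract the double-counted intersection.
A, B = {2, 4, 6}, {4, 5, 6}
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
print(prob(A | B))   # 2/3
```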

33

34

35 Explanation

36 2. Probability of an intersection
There is no easy formula for P(A ∩ B). We might be able to use statistical independence (Section 1.4). If A and B are not statistically independent, we often use conditional probability (Section 1.3).

37 3. Probability of a complement
P(Ā) = 1 − P(A).

38 1.2.5 The Partition Theorem

39 The Partition Theorem. If B1, B2, ..., Bk form a partition of Ω, then for any event A, P(A) = P(A ∩ B1) + P(A ∩ B2) + ... + P(A ∩ Bk).

40 1.2.6 Examples of basic probability calculations
300 Australians were asked about their car preferences. Of the respondents, 33% had children. The respondents were asked what sort of car they would like if they could choose any car at all. 13% of respondents had children and chose a large car. 12% of respondents did not have children and chose a large car. Find the probability that a randomly chosen respondent: (a) would choose a large car; (b) either has children or would choose a large car (or both).

41 First formulate events:
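
A minimal worked computation for parts (a) and (b), using C = “respondent has children” and L = “respondent would choose a large car” as event labels (the labels are mine, not necessarily those on the original slide).

```python
# Given information, expressed as probabilities.
P_C = 0.33            # C: respondent has children
P_C_and_L = 0.13      # has children AND would choose a large car
P_notC_and_L = 0.12   # no children AND would choose a large car

# (a) Partition L over "has children" / "does not have children":
#     P(L) = P(C ∩ L) + P(C̄ ∩ L)
P_L = P_C_and_L + P_notC_and_L
print(round(P_L, 2))          # 0.25

# (b) P(C ∪ L) = P(C) + P(L) − P(C ∩ L)
P_C_or_L = P_C + P_L - P_C_and_L
print(round(P_C_or_L, 2))     # 0.45
```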

42 Respondents were also asked their opinions on car reliability and fuel consumption. 84% of respondents considered reliability to be of high importance, while 40% considered fuel consumption to be of high importance. Formulate events: R = “considers reliability of high importance”, F = “considers fuel consumption of high importance”.

43

44 Probability that respondent considers BOTH reliability AND fuel consumption of high importance.

45 (f) Find the probability that a respondent considered reliability, but not fuel consumption, of high importance.
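
The joint information needed for these two parts was given on the original slides as an image that did not survive the transcript. The sketch below shows the identities involved, with P(R ∪ F) set to a purely hypothetical placeholder so the code runs; the printed numbers are therefore illustrative only.

```python
P_R = 0.84        # considers reliability of high importance
P_F = 0.40        # considers fuel consumption of high importance
P_R_or_F = 0.95   # HYPOTHETICAL placeholder: the true value was on the original slide

# Both reliability and fuel consumption of high importance:
# P(R ∩ F) = P(R) + P(F) − P(R ∪ F)
P_R_and_F = P_R + P_F - P_R_or_F
print(round(P_R_and_F, 2))          # 0.29 with the placeholder value

# (f) Reliability but not fuel consumption:
# P(R ∩ F̄) = P(R) − P(R ∩ F)
print(round(P_R - P_R_and_F, 2))    # 0.55 with the placeholder value
```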

46 1.2.7 Formal probability proofs: non-examinable

47 i)

48

49 §1.3 Conditional probability

50 1.3.1 Conditional Probability
Conditioning is another of the fundamental tools of probability: probably the most fundamental tool. It is especially helpful for calculating the probabilities of intersections, such as P(A∩B), which themselves are critical for the useful Partition Theorem. Additionally, the whole field of stochastic processes is based on the idea of conditional probability. What happens next in a process depends, or is conditional, on what has happened beforehand.

51 Dependent events Suppose A and B are two events on the same sample space. There will often be dependence between A and B. This means that if we know that B has occurred, it changes our knowledge of the chance that A will occur.

52 Example: Toss a die once.

53 Conditioning as reducing the sample space

54 Definition of conditional probability
Conditional probability provides us with a way to reason about the outcome of an experiment, based on partial information.
Definition: For events A and B on the same sample space, with P(B) > 0, the conditional probability of A given B is P(A | B) = P(A ∩ B) / P(B).
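
A small check of the definition (an illustrative sketch, not from the slides): toss a fair die once and condition the event “the roll is a six” on the partial information “the roll is even”.

```python
from fractions import Fraction

omega = set(range(1, 7))   # one roll of a fair die

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(omega))

A = {6}          # "roll is a six"
B = {2, 4, 6}    # "roll is even": the partial information we are given

# Definition: P(A | B) = P(A ∩ B) / P(B), provided P(B) > 0.
P_A_given_B = prob(A & B) / prob(B)
print(P_A_given_B)   # 1/3: conditioning effectively reduces the sample space to B
```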

55 Conditional Probabilities Satisfy the Three Axioms

56 Conditional Probabilities Satisfy General Probability Laws

57 Simple Example using Conditional Probabilities

58 The Multiplication Rule
For events A and B with P(B) > 0, P(A ∩ B) = P(A | B) P(B); equally, P(A ∩ B) = P(B | A) P(A) if P(A) > 0. More generally (the chain rule), P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) P(A2 | A1) P(A3 | A1 ∩ A2) ... P(An | A1 ∩ ... ∩ An−1).

59 1.3.2 Multiplication (Chain) Rule: Example
Example Three cards are drawn from an ordinary 52-card deck without replacement (drawn cards are not placed back in the deck). We wish to find the probability that none of the three cards is a “heart”.
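
A minimal sketch of this chain-rule calculation (added to the transcript): multiply the conditional probabilities that each successive card is not a heart, given that the earlier cards were not.

```python
from fractions import Fraction

# Ai = "the i-th card drawn is not a heart".
# Multiplication (chain) rule: P(A1 ∩ A2 ∩ A3) = P(A1) P(A2 | A1) P(A3 | A1 ∩ A2).
p = Fraction(39, 52) * Fraction(38, 51) * Fraction(37, 50)
print(p, float(p))   # 703/1700, approximately 0.4135
```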

60 1.3.3 Total Probability Theorem
If B1, B2, ..., Bk form a partition of Ω with P(Bi) > 0 for each i, then for any event A, P(A) = P(A | B1) P(B1) + P(A | B2) P(B2) + ... + P(A | Bk) P(Bk).

61 Example Using Total Probability Theorem
You enter a chess tournament where your probability of winning a game is 0.3 against half the players (call them type 1), 0.4 against a quarter of the players (call them type 2), and 0.5 against the remaining quarter of the players (call them type 3). You play a game against a randomly chosen opponent. What is the probability of winning?
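
A worked computation by the Total Probability Theorem, conditioning on the opponent's type. The type-3 share and win probability follow the standard version of this textbook example, since that part of the slide text was truncated in the transcript.

```python
# (share of opponents of this type, probability of beating that type)
types = [(0.50, 0.3),   # type 1: half the players
         (0.25, 0.4),   # type 2: a quarter of the players
         (0.25, 0.5)]   # type 3: the remaining quarter of the players

# Total Probability Theorem: P(win) = sum over types of P(type) * P(win | type).
p_win = sum(share * p_beat for share, p_beat in types)
print(round(p_win, 3))   # 0.375
```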

62 1.3.4 Bayes’ Theorem: inverting conditional probabilities
If B1, B2, ..., Bk form a partition of Ω and P(A) > 0, then P(Bi | A) = P(A | Bi) P(Bi) / [P(A | B1) P(B1) + ... + P(A | Bk) P(Bk)].

63 Example: The False-Positive Puzzle.
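
The puzzle's data did not survive the transcript, so the numbers below are labelled assumptions chosen only to show the shape of the Bayes' Theorem calculation: even a fairly accurate test can leave a small posterior probability of disease.

```python
# ALL THREE NUMBERS BELOW ARE ILLUSTRATIVE ASSUMPTIONS, not values from the original slide.
p_disease = 0.001             # prior: prevalence of the disease
p_pos_given_disease = 0.95    # P(test positive | disease)
p_pos_given_healthy = 0.05    # P(test positive | no disease), the false-positive rate

# Total probability over the partition {disease, no disease}.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' Theorem: invert the conditioning.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))   # about 0.0187: surprisingly small
```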

64 §1.4 Independence of Events

65 1.4.1 Statistical Independence
Definition: Events A and B are statistically independent if P(A ∩ B) = P(A) P(B).

66 We can extend the definition to arbitrarily many events:
Events A1, A2, ..., An are mutually independent if, for every subcollection Ai1, Ai2, ..., Aim, P(Ai1 ∩ Ai2 ∩ ... ∩ Aim) = P(Ai1) P(Ai2) ... P(Aim).

67 Statistical independence for calculating the probability of an intersection
In Section 1.2 we said that it is often hard to calculate P(A ∩ B). We usually have two choices: if A and B are statistically independent, use P(A ∩ B) = P(A) P(B); otherwise, use conditional probability, P(A ∩ B) = P(A | B) P(B).

68 Pairwise independence does not imply mutual independence
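
The classic counterexample, as a sketch (not necessarily the example used on the original slide): toss two fair coins and let A = “first coin heads”, B = “second coin heads”, C = “exactly one head”. Each pair satisfies the product rule, but the triple does not.

```python
from fractions import Fraction
from itertools import product

omega = set(product("HT", repeat=2))   # two fair coin tosses

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == "H"}          # first coin shows heads
B = {w for w in omega if w[1] == "H"}          # second coin shows heads
C = {w for w in omega if w.count("H") == 1}    # exactly one head

# Pairwise independence: every pair satisfies the product rule.
assert prob(A & B) == prob(A) * prob(B)
assert prob(A & C) == prob(A) * prob(C)
assert prob(B & C) == prob(B) * prob(C)

# But not mutual independence: A ∩ B ∩ C is empty, while the product of all three is 1/8.
print(prob(A & B & C), prob(A) * prob(B) * prob(C))   # 0 1/8
```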

69

