Welcome to Probability and the Theory of Statistics This class uses nearly every type of mathematics that you have studied so far as well as some possibly new ideas. For some of you, the idea of counting collections of abstract objects may be new. You should not wait for the exam (Exam 2) to make yourself familiar with the ideas involved in this type of counting. The most important aspect of this course involves solving problems--both applied and theoretical problems. In order to become proficient at solving problems, you must work at it every day. Do the homework in a timely fashion! A new feature of the class this fall involves the Challenge Problems. These problems should help to stretch your gray matter.

Probability Why do we need a theory of probability? Areas of application of such a theory include business/economics, physics, biology, and other areas of mathematics such as geometry and number theory.

Experiments with known outcomes The set of all possible outcomes of an experiment (or situation) is called the sample space of the experiment (or situation) and is denoted by S. Example. The sample space for predicting tomorrow’s weather might be S = {clear, cloudy, shower, storm}. Example. Suppose we count the number of times a coin is flipped until the first head appears. The set of possible outcomes is the set of positive integers. Note that in the latter example, S is infinite.
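As a side illustration (not part of the original slides), here is a minimal Python sketch of the second example: repeating the flip-until-first-head experiment shows that every outcome is a positive integer, and no finite list of outcomes exhausts the sample space.

    # Sketch (not from the slides): simulating the "flip until first head" experiment.
    import random

    def flips_until_first_head():
        count = 0
        while True:
            count += 1
            if random.random() < 0.5:   # heads with probability 1/2
                return count

    outcomes = [flips_until_first_head() for _ in range(10)]
    print(outcomes)   # e.g. [1, 3, 1, 2, 1, 1, 4, 2, 1, 5] -- any positive integer can occur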

Events Any subset E of the sample space S is known as an event. Example. In the previous weather example, possible events include: {cloudy}, {shower, storm}, {storm, cloudy}, {shower, storm, clear, cloudy}, and even the empty set. For any two events E and F, the new event E ∪ F is the set of all outcomes that are in either E or F or in both E and F. E ∪ F is called the union of E and F. Example. {shower, storm} ∪ {storm, cloudy} = {shower, storm, cloudy}.

Events, continued For any two events E and F, the new event EF is the set of all outcomes that are in both E and F. EF is called the intersection of E and F. Example. {shower, storm}{storm, cloudy} = {storm}. If EF = ∅, then E and F are said to be disjoint or mutually exclusive. Question. Which pairs of the following subsets of the positive integers are mutually exclusive? E = set of even numbers, F = set of odd numbers, H = set of multiples of 3, T = set of multiples of 6.
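A quick way to experiment with these definitions (not from the original slides) is with Python sets; the sketch below restricts the four subsets in the question to the integers 1 through 100 purely for illustration.

    # Sketch (not from the slides): exploring the "mutually exclusive" question with sets.
    N = 100
    E = {n for n in range(1, N + 1) if n % 2 == 0}   # even numbers
    F = {n for n in range(1, N + 1) if n % 2 == 1}   # odd numbers
    H = {n for n in range(1, N + 1) if n % 3 == 0}   # multiples of 3
    T = {n for n in range(1, N + 1) if n % 6 == 0}   # multiples of 6

    print(E & F)            # set() -- E and F are mutually exclusive
    print(F & T)            # set() -- every multiple of 6 is even, so F and T are disjoint
    print(E & H)            # nonempty (6, 12, ...) -- E and H are not mutually exclusive
    print(T <= E, T <= H)   # True True -- T is contained in both E and H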

Union of a sequence of events If we have a sequence of events, E1, E2, E3, ..., then the union of these events is defined to be that event which consists of all outcomes that are in En for at least one value of n, n = 1, 2, 3, .... This union is written as ∪n≥1 En. Example. Let S = R, the set of real numbers. Let [a, b] = { x ∈ R | a ≤ x ≤ b } be a closed interval. Let (a, b) = { x ∈ R | a < x < b } be an open interval. Let En = [–1 + (1/n), 1 – (1/n)]. How would you describe ∪n≥1 En in the simplest possible way?

Intersection of a sequence of events and complement of an event If we have a sequence of events, E1, E2, E3, ..., then the intersection of these events is defined to be that event which consists of all outcomes that are in all the events En, n = 1, 2, 3, .... This intersection is written as ∩n≥1 En. Example. Let En = (–1/n, 1/n). How would you describe ∩n≥1 En in the simplest possible way? For any event E, we define the new event E^c, referred to as the complement of E, to consist of all outcomes in the sample space S that are not in E. Example. If S is the set of positive integers and E is the set of even numbers, then E^c is the set of _____ numbers.
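The two interval questions above can be explored numerically; the following sketch (an addition, not part of the slides) prints a few of the sets En and Fn = (–1/n, 1/n) so their limiting behavior is visible.

    # Sketch (not from the slides): numerical intuition for the interval examples.
    # E_n = [-1 + 1/n, 1 - 1/n] grows toward the open interval (-1, 1);
    # the open intervals (-1/n, 1/n) shrink toward the single point {0}.
    for n in (1, 2, 10, 100, 1000):
        print(n, [-1 + 1/n, 1 - 1/n], (-1/n, 1/n))
    # As n increases, the closed intervals eventually cover every x with -1 < x < 1
    # (but never contain -1 or 1), while the shrinking open intervals keep only
    # points arbitrarily close to 0 -- and 0 itself lies in every one of them.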

Containment of events; laws of the algebra of sets For any two events, E and F, if all outcomes in E are also in F, then we say that E is contained in F and we write E ⊂ F (or equivalently, F ⊃ E). It follows that E = F if and only if E ⊂ F and F ⊂ E. Some of the rules of set algebra (for example, the commutative, distributive, and De Morgan laws) are checked in the sketch below. Venn diagrams are useful for showing the relations among sets.
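The identities themselves did not survive in this transcript, so the following sketch (an addition, not the slide's own list) verifies the usual candidates on small concrete sets; comp(A) denotes the complement of A relative to S.

    # Sketch (not from the slides): checking standard set-algebra laws on concrete sets.
    S = set(range(1, 11))                              # a small sample space
    E, F, G = {1, 2, 3, 4}, {3, 4, 5, 6}, {4, 6, 8}

    def comp(A):                                       # complement relative to S
        return S - A

    print(E | F == F | E)                              # commutative law
    print(E | (F & G) == (E | F) & (E | G))            # distributive law
    print(comp(E | F) == comp(E) & comp(F))            # De Morgan: (E union F)^c = E^c intersect F^c
    print(comp(E & F) == comp(E) | comp(F))            # De Morgan: (E intersect F)^c = E^c union F^c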

Frequency interpretation of probability The frequency interpretation of probability is one of several ways of interpreting the meaning of the concept of probability. According to this interpretation, the probability of a certain event is the proportion of times this event occurs when the experiment is conducted a very large number of times. (Other interpretations of probability exist as well.) For a fair coin, we say that the probability of a head showing up is 1/2. Under the frequency interpretation of probability, this means that if the coin is tossed a very large number of times, the fraction of times a head is obtained will be approximately 1/2. Although we often use the frequency interpretation when thinking about probability, it does not lend itself to the formation of a theory in which it is possible to prove theorems. Instead, an axiomatic approach due to Kolmogorov is used. Based on these axioms, the frequency interpretation is obtained as a theorem (see Chapter 11: Laws of Large Numbers).
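A small simulation (not from the slides) shows the frequency interpretation in action: the relative frequency of heads settles near 1/2 as the number of tosses grows.

    # Sketch (not from the slides): relative frequency of heads for a fair coin.
    import random

    for n in (100, 10_000, 1_000_000):
        heads = sum(random.random() < 0.5 for _ in range(n))
        print(n, heads / n)   # e.g. 0.53, 0.5012, 0.500183 -- approaching 1/2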

Axioms of Probability We take an abstract approach to defining probability by stating some properties (axioms) that probability should have, whatever it is that we mean by probability. Consider an experiment whose sample space is S. For each event E in the sample space S, we assume that a real number P(E) is defined and that the following axioms are satisfied: Axiom 1. 0 ≤ P(E). Axiom 2. P(S) = 1. Axiom 3. For any sequence of mutually exclusive events E1, E2, E3, ... (that is, events for which EiEj = ∅ when i ≠ j), P(E1 ∪ E2 ∪ E3 ∪ ...) = P(E1) + P(E2) + P(E3) + .... We refer to P(E) as the probability of event E.

Some simple propositions which follow from the axioms P(∅) = 0. P(E) ≤ 1. P(E^c) = 1 – P(E). If E ⊂ F, then P(E) ≤ P(F). P(E ∪ F) = P(E) + P(F) – P(EF). We call the latter proposition the inclusion-exclusion identity. It has a generalization to n events (see textbook for general case and next slide for n = 4).

Inclusion-Exclusion for n = 4. P(E1 ∪ E2 ∪ E3 ∪ E4) = P(E1) + P(E2) + P(E3) + P(E4) – [P(E1E2) + P(E1E3) + P(E1E4) + P(E2E3) + P(E2E4) + P(E3E4)] + [P(E1E2E3) + P(E1E2E4) + P(E1E3E4) + P(E2E3E4)] – P(E1E2E3E4)
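For readers who like to check identities numerically, the sketch below (an addition, not part of the slides) verifies the n = 4 formula on an equally likely finite sample space with four arbitrarily chosen events.

    # Sketch (not from the slides): numerical check of inclusion-exclusion for n = 4.
    from itertools import combinations
    import random

    S = list(range(1000))
    events = [set(random.sample(S, 300)) for _ in range(4)]   # four arbitrary events

    def P(A):                         # probability under equally likely outcomes
        return len(A) / len(S)

    lhs = P(set.union(*events))
    rhs = 0.0
    for k in range(1, 5):             # alternating sums over all k-fold intersections
        sign = (-1) ** (k + 1)
        for combo in combinations(events, k):
            rhs += sign * P(set.intersection(*combo))
    print(abs(lhs - rhs) < 1e-9)      # True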

How to assign probabilities on a finite sample space Consider the sample space S = {1, 2, 3, …, N}. If we assign probabilities pi to the singleton events {i}, then we can compute the probability of any event E by adding the probabilities of the elements of E. Example. S = {1, 2, 3, …, 8} and E = {3, 6, 7}, so P(E) = p3 + p6 + p7. What condition must the pi satisfy?
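A concrete version of the example (not from the slides, with made-up values of the pi) also answers the closing question: the pi must be nonnegative and sum to 1, by Axioms 1 and 2.

    # Sketch (not from the slides): an assignment of probabilities on S = {1, ..., 8}.
    p = {1: 0.05, 2: 0.10, 3: 0.15, 4: 0.05, 5: 0.20, 6: 0.10, 7: 0.25, 8: 0.10}
    assert all(v >= 0 for v in p.values()) and abs(sum(p.values()) - 1) < 1e-12

    E = {3, 6, 7}
    print(sum(p[i] for i in E))   # P(E) = p3 + p6 + p7 = 0.50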

Example for inclusion-exclusion and Venn diagrams Judy is taking two books on her holiday vacation. With probability 0.5 she will like the first book; with probability 0.4 she will like the second book; with probability 0.3 she will like both books. What is the probability she will like neither book? Let Bi denote the event that Judy likes book i, i = 1, 2. Then the probability that she likes at least one of the books is P(B1 ∪ B2) = P(B1) + P(B2) – P(B1B2) = 0.5 + 0.4 – 0.3 = 0.6. The probability that she likes neither book is the complement: P((B1 ∪ B2)^c) = 1 – 0.6 = 0.4.

Sample spaces having equally likely outcomes For many experiments or situations, it is natural to assume that all outcomes are equally likely to occur. Let S = {1, 2, ..., N} be a finite set containing N elements. It is often (but not always) natural to assume that P({1}) = P({2}) = ... = P({N}). Then Axioms 2 and 3 imply that P({i}) = 1/N, i = 1, 2, ..., N. Next, Axiom 3 implies that for any event E, P(E) = (number of outcomes in E)/N.

Example for “equally likely” Problem. If two dice are rolled, what is the probability that the sum of the upturned faces is 7? In this case, we assume that all 36 possible outcomes are equally likely. The sample space consists of the 36 ordered pairs (i, j) with i, j = 1, ..., 6. For the event that the sum of the dice is 7, there are 6 outcomes: (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1). The desired probability is 6/36 = 1/6.
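The same count can be produced by brute-force enumeration; the sketch below is an addition, not part of the slides.

    # Sketch (not from the slides): enumerating the 36 equally likely outcomes.
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))
    favorable = [o for o in outcomes if sum(o) == 7]
    print(len(favorable), len(outcomes), len(favorable) / len(outcomes))
    # 6 36 0.1666... -- the probability that the sum is 7 is 1/6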

Probability versus Odds We say that the odds in favor of an event A are r to s if P(A)/P(A^c) = r/s, that is, if P(A) = r/(r + s). Similarly, the odds against an event A are s to r if P(A^c)/P(A) = s/r. When the odds in favor of A are r to s, it follows that the odds against A are s to r. Example. What are the odds against drawing an ace from an ordinary deck of 52 cards? The odds in favor of drawing an ace?
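A quick check of the card example (the answer is not spelled out on the slide, so this is an addition): P(ace) = 4/52 = 1/13, which gives odds in favor of 1 to 12 and odds against of 12 to 1.

    # Sketch (not from the slides): odds for drawing an ace from a 52-card deck.
    from fractions import Fraction

    p_ace = Fraction(4, 52)          # P(ace) = 4/52 = 1/13
    print(p_ace / (1 - p_ace))       # 1/12 -> odds in favor are 1 to 12
    print((1 - p_ace) / p_ace)       # 12   -> odds against are 12 to 1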

Probability as a continuous set function A sequence of events {En, n ≥ 1} is: increasing when E1 ⊂ E2 ⊂ E3 ⊂ ..., and then we define lim En = ∪n≥1 En; decreasing when E1 ⊃ E2 ⊃ E3 ⊃ ..., and then we define lim En = ∩n≥1 En. Proposition. If {En, n ≥ 1} is either an increasing or a decreasing sequence of events, then lim P(En) = P(lim En). Example. Let S = [–1, 1] and let P([a, b]) = P((a, b)) = (b – a)/2. Let En = [–1 + 1/n, 1 – 1/n] and Fn = (0, 1/n). What are the values of lim P(En) and lim P(Fn)?
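The limits in the example can be computed directly from the definition of P; the sketch below is an addition, not part of the slides.

    # Sketch (not from the slides): P(E_n) = ((1 - 1/n) - (-1 + 1/n)) / 2 = 1 - 1/n,
    # and P(F_n) = (1/n - 0) / 2 = 1/(2n).
    for n in (1, 10, 100, 10_000):
        print(n, 1 - 1/n, 1 / (2 * n))
    # P(E_n) -> 1 = P(union of the E_n), and P(F_n) -> 0 = P(intersection of the F_n),
    # consistent with the continuity proposition above.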

Probability as a measure of belief One way of thinking of probability is in terms of relative frequency as described in the textbook. This way of thinking requires events that can be repeated. However, there are times when probability is used for events that can’t be repeated. Suppose you are on a jury and, in your mind, you assign a probability of 0.9 to the event, “the defendant is guilty”. You feel 90% sure that the defendant is guilty. Here, repetition of the event doesn’t make sense. However, the axioms of probability must still hold for this type of subjective probability assignment. The sample space is S = {guilty, innocent}. It follows that P({innocent}) = 1 – P({guilty}) = 0.1. Using the methods of Chapter 3, Conditional Probability and Independence, subjective probabilities can be updated as more information becomes available.

Random Selection of Points from Intervals A point is said to be randomly selected from an interval (a, b) if any two subintervals of (a, b) that have the same length are equally likely to include the point. The probability associated with the event that the subinterval (α, β) contains the point is defined to be (β – α)/(b – a). Problem. Pick a random time from the interval from 1:00am to 2:00am. What is the probability that the time picked is later than 1:45am? Solution. Using the above notation, a = 1, b = 2, α = 1.75, β = 2, so the desired probability is (2 – 1.75)/(2 – 1) = 0.25. Also, if E is the event that the time is later than 1:45am, then P(E) = 0.25.

Appendix 1: A useful formula proved by math induction The formula: for r ≠ 1, a + ar + ar^2 + ... + ar^(n-1) = a(1 – r^n)/(1 – r). Basis step: Put n = 1 on both sides. l.h.s. = a, r.h.s. = a(1 – r)/(1 – r) = a. Since l.h.s. = r.h.s., the basis step is complete. Induction hypothesis: Assume the formula is true for n = k, that is, a + ar + ... + ar^(k-1) = a(1 – r^k)/(1 – r). Now add the next term, ar^k, to both sides of the latter equation: a + ar + ... + ar^(k-1) + ar^k = a(1 – r^k)/(1 – r) + ar^k = [a(1 – r^k) + ar^k(1 – r)]/(1 – r) (simply add the fractions) = a(1 – r^(k+1))/(1 – r). This is the original formula with n replaced by k + 1.

Appendix 2: An infinite series called a “geometric series” Let Sn = a + ar + ar^2 + ... + ar^(n-1). Then Sn is called the nth partial sum of the infinite series a + ar + ar^2 + ar^3 + .... The sum of the infinite series is defined to be S, where Sn → S as n → ∞. When this limit exists as a real number, we say that the series converges. When this limit does not exist, we say the series diverges. When |r| < 1, the geometric series converges. Furthermore, S = a/(1 – r). This follows from the formula proved in Appendix 1. Alternatively, write S = a + ar + ar^2 + .... If we multiply through by r, we have rS = ar + ar^2 + ar^3 + ...; subtracting gives S – rS = a, so S = a/(1 – r).
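The convergence of the partial sums is easy to see numerically; the sketch below (an addition, not part of the slides) uses a = 3 and r = 1/2, for which a/(1 – r) = 6.

    # Sketch (not from the slides): partial sums of a geometric series approaching a/(1 - r).
    a, r = 3.0, 0.5
    limit = a / (1 - r)            # 6.0
    s, partials = 0.0, []
    for n in range(1, 21):
        s += a * r ** (n - 1)      # add the nth term a*r^(n-1)
        partials.append(s)
    print(partials[0], partials[4], partials[19], limit)
    # 3.0  5.8125  5.99999...  6.0 -- S_n converges to a/(1 - r) when |r| < 1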

Appendix 3. An example of a countably infinite sample space. Let S = {1, 2, 3, ...}. That is, S is the set of positive integers. Assign probabilities to singletons by: Verify that P(S) = 1. Let E be the set of odd integers. Evaluate P(E). Let En = {n, n+1, n+2, ...}, n = 1, 2, 3, .... Evaluate P(En). Verify that lim P(En) = P(∩n≥1 En) in this example.
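The singleton probabilities are not reproduced in this transcript. The sketch below therefore ASSUMES the common choice P({n}) = (1/2)^n, which sums to 1 by the geometric series of Appendix 2; this is an assumption for illustration, not necessarily the slide’s formula.

    # Sketch (not from the slides). ASSUMPTION: P({n}) = (1/2)**n, which sums to 1;
    # the actual assignment on the original slide is not shown in this transcript.
    def P_singleton(n):
        return 0.5 ** n

    # P(S) = sum over n >= 1 of (1/2)^n = 1 (geometric series with a = r = 1/2)
    print(sum(P_singleton(n) for n in range(1, 60)))          # ~1.0

    # E = odd integers: P(E) = 1/2 + 1/8 + 1/32 + ... = (1/2)/(1 - 1/4) = 2/3
    print(sum(P_singleton(n) for n in range(1, 60, 2)))       # ~0.6667

    # E_n = {n, n+1, ...}: P(E_n) = (1/2)^(n-1), which tends to 0 = P(empty intersection)
    print([0.5 ** (n - 1) for n in (1, 2, 5, 10)])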