Assignment #1 results (histogram omitted): mean = 41.21, median = 42.5, s = 7.59, with markers at x̄ - 1s and x̄ + 1s.

Course Schedule

Probabilities in Geography The analysis of many problems (everyday or geographic) often relies on probabilities, such as: What are the “chances” of having rain over the weekend? What is the “likelihood” that the 100-year flood will occur within the next ten years? How “likely” is it that a pixel on a satellite image is correctly classified or misclassified?

Probability & Probability Distribution We summarize a sample statistically and want to make some inferences about the population (e.g., what proportion of the population has values within a given range) The concept of probability is the key to making statistical inferences by sampling a population What we are doing is trying to ascertain the probability of an event having a given outcome This requires us to be able to specify the distribution of a variable before we can make inferences

Probability & Probability Distributions Previously, we looked at some proportions of area under the normal curve: Source: Earickson, RJ, and Harlin, JM Geographic Measurement and Quantitative Analysis. USA: Macmillan College Publishing Co., p. 100.

Probability & Probability Distributions BUT before we could use the normal curve, we have to find out if this is the right distribution for our variable … While many natural phenomena are normally distributed, there are other phenomena that are best described using other distributions Background on probabilities (terminology & rules), and a few useful distributions: Discrete distributions: Binomial and Poisson Continuous distributions: Normal and its relatives

Probability-Related Concepts An event – Any phenomenon you can observe that can have more than one outcome (e.g., flipping a coin) An outcome – Any unique condition that can be the result of an event (e.g., flipping a coin: heads or tails), a.k.a. a simple event or sample point Sample space – The set of all possible outcomes associated with an event – e.g., flip a coin: heads (H) and tails (T) – e.g., flip a coin twice: HH, HT, TH, TT

Probability-Related Concepts Associated with each possible outcome in a sample space is a probability. Probability is a measure of the likelihood of each possible outcome; it measures the degree of uncertainty. Each of the probabilities is greater than or equal to zero, and less than or equal to one. The sum of probabilities over the sample space is equal to one.
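
As an aside (not part of the original slides), a minimal Python sketch that checks these two requirements for a proposed assignment of probabilities; the fair-die values are just an illustration:

    # Hypothetical example: probabilities assigned to the outcomes of a fair six-sided die.
    probabilities = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}

    def is_valid_assignment(p):
        """True if every probability lies in [0, 1] and they sum to one."""
        values = list(p.values())
        in_range = all(0.0 <= v <= 1.0 for v in values)
        sums_to_one = abs(sum(values) - 1.0) < 1e-9   # tolerate floating-point error
        return in_range and sums_to_one

    print(is_valid_assignment(probabilities))   # True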

Probability – Examples Example I – Flip a coin –Two possible outcomes: "heads", "tails" –Each outcome is equally likely –"heads" and "tails" have the same probability (0.5) –The sum of probabilities over the sample space is one –Over a long run of flips, the # of "heads" and the # of "tails" will be nearly equal

Probability – Examples Example II – Flip a coin twice –Four outcomes are equally likely –Tosses of the coin are independent –Each outcome has probability 1/4 –The probability of a head on Flip 1 and a head on Flip 2 is 1/2 * 1/2 = 1/4

Outcome   First flip   Second flip
1         Heads        Heads
2         Heads        Tails
3         Tails        Heads
4         Tails        Tails

How To Assign Probabilities to Experimental Outcomes? There are numerous ways to assign probabilities to the elements of sample spaces. The classical method assigns probabilities based on the assumption of equally likely outcomes. The relative frequency method assigns probabilities based on experimentation or historical data. The subjective method assigns probabilities based on the assignor’s judgment or belief.

Classical Method This approach assumes that each outcome is equally likely. If an experiment has n possible outcomes, this method would assign a probability of 1/n to each outcome. It is an appropriate way to assign probabilities in special kinds of experiments, namely those whose outcomes can reasonably be treated as equally likely (e.g., fair coins or dice).

Classical Method Example I: Rolling a die Sample Space: S = {1, 2, 3, 4, 5, 6} Probabilities: Each sample point has a 1/6 chance of occurring.

Classical Method Example II – Flip four coins –Let "0" represent "heads" and "1" represent "tails" –For each toss, the probability of "heads" or "tails" is ½ –Assume that the outcomes of the four tosses are independent of one another –There are 2 × 2 × 2 × 2 = 16 possible outcomes –Probability of each outcome: ½ * ½ * ½ * ½ = 1/16 = 0.0625
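
A small Python sketch (not from the slides) that enumerates the sixteen outcomes and assigns each the classical probability 1/n:

    from itertools import product

    # Every possible outcome of flipping four coins, e.g. ('H', 'H', 'T', 'H').
    outcomes = list(product("HT", repeat=4))

    n = len(outcomes)
    p_each = 1 / n        # classical method: equally likely outcomes

    print(n)              # 16
    print(p_each)         # 0.0625 (= 1/16)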

Relative Frequency Method The second way is to assign probabilities on the basis of relative frequencies. Example –Given a weather pattern, a meteorologist may note that in 65 out of the last 100 times that such a pattern prevailed, there was measurable precipitation the next day –If there were such a weather pattern today, what would the probability of having rain tomorrow be? –The possible outcomes – rain or no rain tomorrow – are assigned probabilities of 0.65 and 0.35, respectively
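
As a sketch (the records below are made up to match the example), the relative frequency estimate is simply the observed proportion:

    # Hypothetical historical records: 1 = measurable precipitation the next day, 0 = none.
    records = [1] * 65 + [0] * 35          # 65 of the last 100 occurrences of the pattern

    p_rain = sum(records) / len(records)   # relative frequency estimate
    p_no_rain = 1 - p_rain

    print(p_rain, p_no_rain)               # 0.65 0.35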

Subjective Method When extreme weather conditions occur, it might be inappropriate to assign probabilities based solely on historical data. We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur. The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimates.

Probability Rules Rules for combining multiple probabilities. A useful aid is the Venn diagram, which depicts multiple probabilities and their relations using a graphical depiction of sets. The rectangle that forms the area of the Venn diagram represents the sample (or probability) space, which we have defined above. Figures that appear within the sample space are sets that represent events in the probability context, and their area is proportional to their probability (full sample space = 1).

Probability Rules We can use a Venn diagram to describe the relationships between two sets or events, and the corresponding probabilities. The union of sets A and B (written symbolically as A ∪ B) is represented by the areas enclosed by sets A and B together, and can be expressed by OR (i.e. the union of the two sets includes any location in A or B). The intersection of sets A and B (written symbolically as A ∩ B) is the area that is overlapped by both the A and B sets, and can be expressed by AND (i.e. the intersection of the two sets includes locations in A AND B).

Addition Rule If sets A and B do not overlap in the Venn diagram, the sets are disjoint, and this represents a case of two mutually exclusive events. The union of sets A and B here uses the addition rule, where P(A ∪ B) = P(A) + P(B). You can think of this in terms of the areas of the events, where the union in this case is simply the sum of the areas. The intersection of sets A and B here results in the empty set (symbolized by ∅), because at no point do the circles overlap, so P(A ∩ B) = 0.

Probability Rules For example, suppose set A represents a roll of 1 or 2 on a 6-sided die, so P(A) = 2/6, and set B represents a roll of 3 or 4, so P(B) = 2/6. The union of sets A and B here uses the addition rule: P(A ∪ B) = P(A) + P(B) = 2/6 + 2/6 = 4/6 = 2/3 ≈ 0.67. The outcomes represented here are mutually exclusive, thus there is no intersection between sets A and B, and P(A ∩ B) = 0.
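
A minimal Python sketch (not from the slides) that reproduces this die example by counting equally likely sample points:

    from fractions import Fraction

    sample_space = {1, 2, 3, 4, 5, 6}      # one roll of a fair die
    A = {1, 2}                             # event: roll a 1 or 2
    B = {3, 4}                             # event: roll a 3 or 4

    def prob(event):
        """Classical probability: favourable outcomes / total outcomes."""
        return Fraction(len(event), len(sample_space))

    print(prob(A | B))                      # 2/3 -- union, so the addition rule applies
    print(prob(A & B))                      # 0   -- empty intersection: mutually exclusive
    print(prob(A) + prob(B) == prob(A | B)) # True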

Probability Rules – General Addition Rule If sets A and B do overlap in the Venn diagram, the events are not mutually exclusive. The union of sets A and B here is P(A ∪ B) = P(A) + P(B) - P(A ∩ B), because we do not wish to count the intersection area twice; we need to subtract it from the sum of the areas of A and B when taking the union of a pair of overlapping sets. If, in addition, the events are independent, the intersection of sets A and B is calculated by taking the product of the two probabilities, a.k.a. the multiplication rule: P(A ∩ B) = P(A) * P(B).

General Addition Rule Consider set A to give the chance of precipitation at P(A) = 0.4 and set B to give the chance of below-freezing temperatures at P(B) = 0.7 (treating the two events as independent). The intersection of sets A and B here is P(A ∩ B) = P(A) * P(B) = 0.4 * 0.7 = 0.28. This expresses the chance of snow at P(A ∩ B) = 0.28. The union of sets A and B here is P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = 0.4 + 0.7 - 0.28 = 0.82. This expresses the chance of below-freezing temperatures or precipitation occurring at P(A ∪ B) = 0.82.
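
The same arithmetic as a short Python sketch (the independence of the two events is an assumption carried over from the slide’s use of the multiplication rule):

    p_precip = 0.4       # P(A): precipitation
    p_freeze = 0.7       # P(B): below-freezing temperatures

    # Multiplication rule (valid only under the independence assumption):
    p_both = p_precip * p_freeze                  # P(A ∩ B) = 0.28

    # General addition rule:
    p_either = p_precip + p_freeze - p_both       # P(A ∪ B) = 0.82

    print(round(p_both, 2), round(p_either, 2))   # 0.28 0.82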

Complement Consider set A to give the chance of precipitation at P(A) = 0.4 and set B to give the chance of below-freezing temperatures at P(B) = 0.7. The complement of set A is P(A′) = 1 - P(A) = 1 - 0.4 = 0.6. This expresses the chance of it not raining or snowing at P(A′) = 0.6. The complement of the union of sets A and B is P((A ∪ B)′) = 1 - [P(A) + P(B) - P(A ∩ B)] = 1 - [0.4 + 0.7 - 0.28] = 0.18. This expresses the chance of it neither raining nor being below freezing at P((A ∪ B)′) = 0.18.
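
Continuing the same sketch, the complements follow directly (again assuming independence for the intersection term):

    p_a, p_b = 0.4, 0.7
    p_both = p_a * p_b                             # P(A ∩ B) = 0.28

    p_not_a = 1 - p_a                              # P(A') = 0.6
    p_neither = 1 - (p_a + p_b - p_both)           # P((A ∪ B)') = 0.18

    print(round(p_not_a, 2), round(p_neither, 2))  # 0.6 0.18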

Probability Rules We can also encounter the situation where set A is fully contained within set B, which is equivalent to saying that set A is a subset of set B. For example, set A might represent precipitation events with >= 5 inches, whereas set B denotes any events with >= 1 inch; A is contained within B because anytime A occurs, B occurs as well. In probability terms, this situation occurs when outcome B is a necessary precondition for outcome A to occur, although not vice versa (if instead A were a precondition for B, set B would be contained in set A).

Probability – Example Example – # of malls within cities

City   # of Malls
A      1
B      4
C      4
D      4
E      2
F      3

We might wonder: if we randomly pick one of these six cities, what is the probability (chance) that it will have n malls? The table of counts is the sample space, and each count of the # of malls in a city is an event.

Random Variables What we have here is a random variable – defined as a function that associates a unique numerical value with every outcome of an experiment. To put this another way, a random variable is a function defined on the sample space; this means that we are interested in all the possible outcomes. A random variable X is a rule that assigns a numerical value to each outcome in the sample space of an experiment.

Random Variables The value of the random variable will vary from trial to trial as the experiment is repeated. We use an uppercase letter to denote a random variable and a lowercase letter to denote a particular value of the variable. A random variable can be classified as being either discrete or continuous depending on the numerical values it assumes.

Discrete & Continuous Variables Discrete variable – A variable that can take on only a finite number of values –# of malls within cities –# of vegetation types within geographic regions –# population Continuous variable – A variable that can take on an infinite number of values (all real number values) –Elevation (e.g., [500.0, ]) –Temperature (e.g., [10.0, 20.0]) –Precipitation (e.g., [100.0, 500.0]

Probability Distribution & Probability Function The question was: If we randomly pick one of the six cities, what is the probability (or chance) that it will have n malls? To answer this question, we need to form a probability function (probability distribution) from the sample space that gives all values of a random variable and their probabilities Then we can find the probability that a randomly selected city has n malls from the probability function

Probability Function & Probability Distribution The probability distribution for a random variable describes how probabilities are distributed over the values of the random variable In other words, a probability distribution expresses the relative number of times we expect a random variable to assume each and every possible value The probability distribution of a random variable may be represented by a table, a graph, or an equation

Probability Function & Probability Distribution The probability distribution is defined by a probability function, denoted by p(X) or f(x), which provides the probability for each value of the random variable p(X) or f(x) represents the probability function or the probability distribution for the random variable X

Probability Function – An Example Here, the values of x_i are drawn from the four outcomes, and their probabilities are the number of events with each outcome divided by the total number of events: P(x_i) = (# of times the outcome occurred) / (total number of events). Using the city/mall table (A: 1, B: 4, C: 4, D: 4, E: 2, F: 3):

x_i   P(x_i)
1     1/6 ≈ 0.167
2     1/6 ≈ 0.167
3     1/6 ≈ 0.167
4     3/6 = 0.5
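
A minimal Python sketch (not part of the slides) that builds this probability function directly from the table of mall counts:

    from collections import Counter
    from fractions import Fraction

    # Number of malls in each of the six cities (from the table above).
    malls = {"A": 1, "B": 4, "C": 4, "D": 4, "E": 2, "F": 3}

    counts = Counter(malls.values())
    n_cities = len(malls)

    # P(x_i) = (# of cities with x_i malls) / (total # of cities)
    pmf = {x: Fraction(c, n_cities) for x, c in sorted(counts.items())}

    for x, p in pmf.items():
        print(x, p)            # 1 1/6, 2 1/6, 3 1/6, 4 1/2 (one pair per line)
    print(sum(pmf.values()))   # 1 -- the probabilities sum to one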

Probability Function We can plot this probability distribution as a probability function. The plot uses thin vertical lines to denote that the probabilities are massed at discrete values of this random variable: spikes of height 1/6 at x = 1, 2, and 3, and a spike of height 3/6 = 0.5 at x = 4.

Probability Mass Functions A discrete random variable can be described by a probability mass function (pmf). A probability mass function is usually represented by a table, graph, or equation. The probability of any outcome must satisfy: 0 <= p(X = x_i) <= 1 for i = 1, 2, 3, …, k-1, k. The sum of all probabilities in the sample space must total one, i.e. Σ_{i=1}^{k} p(X = x_i) = 1.

Probability Mass Function Example: # of malls in cities. This plot uses thin vertical lines to denote that the probabilities are massed at discrete values of this random variable: p(X = 1) = p(X = 2) = p(X = 3) = 1/6 and p(X = 4) = 3/6 = 0.5.

Discrete Probability Distribution We can calculate the mean and variance of a discrete probability distribution:

µ = Σ_{i=1}^{k} x_i * p(x_i)

σ² = Σ_{i=1}^{k} (x_i – µ)² * p(x_i)

We use µ and σ² here because the basic idea of a probability distribution is to use a large number of samples to approach the distribution of a population.
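
Continuing the sketch above, the mean and variance of the mall-count distribution (exact fractions used to avoid rounding):

    from fractions import Fraction as F

    pmf = {1: F(1, 6), 2: F(1, 6), 3: F(1, 6), 4: F(3, 6)}

    mu = sum(x * p for x, p in pmf.items())                     # µ = 3
    sigma_sq = sum((x - mu) ** 2 * p for x, p in pmf.items())   # σ² = 4/3

    print(mu, sigma_sq, float(sigma_sq))   # 3 4/3 1.3333333333333333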

Continuous Random Variables A continuous random variable can assume all real number values within an interval (e.g., rainfall, pH). The probability distribution of a continuous random variable is described by a probability density function (pdf). A probability density function (pdf) is usually represented by a graph or equation.

Again, there are two fundamental requirements for a probability density function (pdf): f(x) >= 0 for all x, and the total area under the curve must equal one, i.e. ∫ f(x) dx = 1 taken over the whole range of x.

Probability Density Functions Theoretically, a continuous variable’s range can extend from negative infinity to infinity, e.g. the normal distribution: the tails of the normal distribution’s curve extend infinitely in each direction, but the value of f(x) approaches zero, getting closer and closer, but never reaching zero (and the total area under the curve remains 1).
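
As an illustrative check (not from the slides), a crude Riemann sum in Python confirming that the area under the standard normal density is effectively one:

    import math

    def normal_pdf(x, mu=0.0, sigma=1.0):
        """Density f(x) of the normal distribution N(mu, sigma^2)."""
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    # Sum rectangle areas over [-10, 10]; the tails beyond ten standard
    # deviations contribute a negligible amount.
    n_steps = 20_000
    dx = 20 / n_steps
    area = sum(normal_pdf(-10 + i * dx) * dx for i in range(n_steps))

    print(round(area, 6))   # ~1.0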

The probability of a continuous random variable X falling within an arbitrary interval [a, b] is given by P(a <= X <= b) = ∫_a^b f(x) dx. Simply calculate the shaded area under the curve between a and b; if we know the density function, we can use calculus.
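
A hedged Python sketch of this calculation for a standard normal density, using the closed-form cumulative distribution via math.erf instead of integrating by hand (the interval endpoints are illustrative):

    import math

    def normal_cdf(x, mu=0.0, sigma=1.0):
        """Cumulative probability P(X <= x) for a normal random variable."""
        return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

    # P(a <= X <= b) is the area under f(x) between a and b.
    a, b = -1.0, 1.0
    prob = normal_cdf(b) - normal_cdf(a)

    print(round(prob, 4))   # ~0.6827, the familiar area within one standard deviation of the mean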

Probability Density Functions Fortunately, we do not need to solve the integral ourselves to practice statistics … instead, if we can match the f(x) up to some known distribution, we can use a table of probabilities that someone else has developed. Tables A.2 through A.6 in the epilogue of the Rogerson text (pp. …) give probability values for several distributions, including the normal distribution and some related distributions used by various inferential statistics.

Probability Density Functions Suppose we are interested in computing the probability of a continuous random variable at a certain value of x (e.g. at d). Can we find the probability of a value occurring exactly at d, i.e. p(d)? No, p(d) = 0. The reason is: as the interval from c to d becomes vanishingly narrow, the area below the curve within it becomes vanishingly small, so P(c <= X <= d) → 0 as c → d, and the probability of any single exact value of a continuous random variable is zero.
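
A small numerical illustration (self-contained, standard normal assumed): as the interval [c, d] shrinks, the probability it carries shrinks toward zero, which is why a single exact value has probability zero.

    import math

    def normal_cdf(x, mu=0.0, sigma=1.0):
        return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

    d = 1.0
    for width in (1.0, 0.1, 0.01, 0.001):
        c = d - width
        # Probability carried by the narrowing interval [c, d]:
        print(width, round(normal_cdf(d) - normal_cdf(c), 6))
    # The printed probabilities decrease toward 0, so P(X = d) = 0 in the limit.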