Fall 2014 Fadwa ODEH (lecture 1). Probability & Statistics.

Tossing a pair of dice. For one die, the probability of any face coming up is the same, 1/6; therefore it is equally probable that any number from one to six will come up. For two dice, what is the probability that the total will come up 2, 3, 4, and so on up to 12?

To calculate the probability of a particular outcome, count all possible results, then count how many of them give the desired outcome. The probability of the desired outcome equals the number of favourable results divided by the total number of results. Hence 1/6 for one die.

List all possible outcomes (36) for a pair of dice.

Total   Combinations                        How Many
  2     1+1                                    1
  3     1+2, 2+1                               2
  4     1+3, 3+1, 2+2                          3
  5     1+4, 4+1, 2+3, 3+2                     4
  6     1+5, 5+1, 2+4, 4+2, 3+3                5
  7     1+6, 6+1, 2+5, 5+2, 3+4, 4+3           6
  8     2+6, 6+2, 3+5, 5+3, 4+4                5
  9     3+6, 6+3, 4+5, 5+4                     4
 10     4+6, 6+4, 5+5                          3
 11     5+6, 6+5                               2
 12     6+6                                    1
                                        Sum = 36
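As a quick check, the table can be reproduced by brute force; the following is a small illustrative Python sketch, not part of the original slides:

```python
from collections import Counter

# Enumerate all 36 equally likely outcomes for a pair of dice
# and count how many ways each total can occur.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

for total in range(2, 13):
    ways = counts[total]
    print(f"total {total:2d}: {ways} ways, probability {ways}/36")

print("sum of ways =", sum(counts.values()))  # 36
```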

Each possible outcome is called a “microstate”. The combination of all microstates that give the same number of spots is called a “macrostate”. The macrostate that contains the most microstates is the most probable to occur.

Combining Probabilities
If a given outcome can be reached in two (or more) mutually exclusive ways whose probabilities are pA and pB, then the probability of that outcome is pA + pB. This is the probability of having either A or B.
If a given outcome represents the combination of two independent events whose individual probabilities are pA and pB, then the probability of that outcome is pA × pB. This is the probability of having both A and B.

Examples Paint two faces of a die red. When the die is thrown, what is the probability of a red face coming up? Throw two normal dice. What is the probability of two sixes coming up?
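For reference, a minimal Python sketch (not from the slides) applying the two rules to these examples and checking the second one by enumeration:

```python
from fractions import Fraction
from itertools import product

# Example 1: two of the six faces are painted red.
# "Red comes up" can happen via either red face (mutually exclusive),
# so P(red) = 1/6 + 1/6 = 1/3.
p_red = Fraction(1, 6) + Fraction(1, 6)

# Example 2: two independent dice both show six,
# so P(two sixes) = 1/6 * 1/6 = 1/36.
p_two_sixes = Fraction(1, 6) * Fraction(1, 6)

# Brute-force confirmation of the second result over all 36 outcomes.
outcomes = list(product(range(1, 7), repeat=2))
check = Fraction(sum(1 for a, b in outcomes if a == b == 6), len(outcomes))

print(p_red, p_two_sixes, check)  # 1/3 1/36 1/36
```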

Let p be the probability of success (the desired event or outcome, here 1/6 for one die), and let q be the probability of failure (the undesired event or outcome, here 5/6 for one die). Then p + q = 1, or q = 1 – p. When two dice are thrown, what is the probability of getting exactly one six?

Probability of a six on the first die and not the second is pq = (1/6)(5/6) = 5/36. The probability of a six on the second die and not the first is the same, so p(1) = 2pq = 10/36.

Probability of no sixes coming up is qq = (5/6)² = 25/36, and the probability of two sixes is pp = (1/6)² = 1/36. The sum of all three probabilities is: p(2) + p(1) + p(0) = 1/36 + 10/36 + 25/36 = 1.

pp + (pq + pq) + qq = 1
p² + 2pq + q² = 1
(p + q)² = 1
The exponent is the number of dice (or tries). Is this general?

Three Dice Example
(p + q)³ = 1
p³ + 3p²q + 3pq² + q³ = 1
p(3) + p(2) + p(1) + p(0) = 1
It works! It must be general: (p + q)^N = 1.

Binomial Distribution
Probability of n successes in N attempts, where q = 1 – p:
P(n) = [N! / (n!(N – n)!)] pⁿ q^(N–n)
These are exactly the terms of the expansion (p + q)^N = 1.
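A compact way to evaluate these probabilities numerically; an illustrative sketch, not code from the lecture:

```python
from math import comb

def binomial_pmf(n, N, p):
    """Probability of exactly n successes in N independent attempts."""
    q = 1.0 - p
    return comb(N, n) * p**n * q**(N - n)

# Two dice, "success" = rolling a six (p = 1/6):
for n in range(3):
    print(n, "sixes:", binomial_pmf(n, 2, 1/6))   # 25/36, 10/36, 1/36

# The probabilities always sum to 1, i.e. (p + q)^N = 1:
print(sum(binomial_pmf(n, 10, 1/6) for n in range(11)))
```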

Thermodynamic Probability
The term with all the factorials in the previous equation is the number of microstates that lead to the particular macrostate. It is called the “thermodynamic probability”: w_n = N! / (n!(N – n)!).

Microstates
The total number of microstates is the sum of the multiplicities over all macrostates, Ω = Σ w_n; for a two-outcome system (success or failure on each of the N tries) this sum is 2^N. For a very large number of particles, the distribution of w_n becomes extremely sharply peaked about its maximum.
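A small numerical illustration of these two statements (not from the original slides): the multiplicities sum to 2^N, and the fraction of microstates held by the single most probable macrostate shrinks as N grows.

```python
from math import comb

for N in (6, 100, 1000):
    w = [comb(N, n) for n in range(N + 1)]   # multiplicities w_n
    total = sum(w)                           # total number of microstates
    print(N, total == 2**N, max(w) / total)  # peak fraction falls off roughly as 1/sqrt(N)
```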

Mean (Average) of Binomial Distribution: ⟨n⟩ = Σ n P(n) = Np.

Standard Deviation (σ): σ² = ⟨(n – ⟨n⟩)²⟩ = ⟨n²⟩ – ⟨n⟩².

For a Binomial Distribution: σ² = Npq, so σ = √(Npq).
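These results can be checked directly against the distribution itself; a minimal self-contained sketch (my own check, with N = 100 and p = 1/6 chosen as an example):

```python
from math import comb, sqrt

N, p = 100, 1/6
q = 1 - p
pmf = [comb(N, n) * p**n * q**(N - n) for n in range(N + 1)]

mean = sum(n * pmf[n] for n in range(N + 1))
var = sum((n - mean)**2 * pmf[n] for n in range(N + 1))

print(mean, N * p)                  # both ~16.667  (mean = Np)
print(sqrt(var), sqrt(N * p * q))   # both ~3.727   (sigma = sqrt(Npq))
```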

Coins
Toss 6 coins. For a fair coin p = q = 1/2, so the probability of n heads is the number of ways of choosing n heads out of the N tosses (“N choose n”, i.e. N!/(n!(N – n)!)) divided by the total number of microstates 2^N:
P(n) = [N! / (n!(N – n)!)] (1/2)^N.

For Six Coins

For 100 Coins

For 1000 Coins
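The three slides above showed the coin-toss distribution for N = 6, 100, and 1000. As an illustration of the point they make (the distribution becomes relatively sharper as N grows), the relative width σ/⟨n⟩ can be computed; this sketch is mine, not from the lecture:

```python
from math import sqrt

p = q = 0.5
for N in (6, 100, 1000):
    mean = N * p
    sigma = sqrt(N * p * q)
    print(f"N = {N:4d}: mean = {mean:6.1f}, sigma = {sigma:6.2f}, "
          f"relative width sigma/mean = {sigma / mean:.3f}")
```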

Multiple Outcomes
When each try can end in more than two outcomes, with N1 of the N particles in outcome 1, N2 in outcome 2, and so on, the number of microstates generalizes to W = N! / (N1! N2! ··· Nk!). We want to calculate ln W.

Stirling’s Approximation: for large N, ln N! ≈ N ln N – N.
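A quick numerical check of the approximation (illustrative, not part of the slides); `lgamma(N + 1)` gives ln N! without overflow:

```python
from math import lgamma, log

for N in (10, 100, 1000, 10**6):
    exact = lgamma(N + 1)            # ln(N!)
    stirling = N * log(N) - N        # Stirling's approximation
    print(f"N = {N:>7}: ln N! = {exact:14.2f}, "
          f"N ln N - N = {stirling:14.2f}, "
          f"relative error = {(exact - stirling) / exact:.2e}")
```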

Number Expected
Toss 6 coins N times. The probability of n heads in a single toss of the 6 coins is P(n), so the number of times n heads is expected to appear in the N repetitions is N·P(n).
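As a worked illustration (the choice of 64 repetitions is mine, not from the slides; it makes every expected count a whole number):

```python
from math import comb

N_coins, trials = 6, 64
for n in range(N_coins + 1):
    P_n = comb(N_coins, n) / 2**N_coins   # probability of n heads in one toss of 6 coins
    print(f"{n} heads: P = {P_n:.4f}, expected count in {trials} tosses = {trials * P_n:.1f}")
```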

Example: compute the multiplicities of macrostates in an elementary (quantum!) model of a paramagnet. We can view the paramagnet as N magnetic moments, each of which can be in one of 2 states, pointing either parallel or anti-parallel to some given axis (determined, e.g., by an applied magnetic field). These states are referred to as “up” and “down”, respectively. The total magnetization M along the given axis of the paramagnet is then proportional to the difference N_up – N_down = 2N_up – N. Evidently, the macrostate specified by M has multiplicity given by the number of ways of choosing N_up magnetic moments to be “up” out of a total of N magnetic moments:
Ω(N_up) = N! / (N_up! (N – N_up)!)
So the paramagnet is like tossing a coin.
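To make the analogy concrete, here is a short illustrative sketch (variable names such as `n_up` are mine, not from the lecture) tabulating the multiplicity of each magnetization macrostate for a small paramagnet:

```python
from math import comb

N = 6  # number of magnetic moments (same counting as tossing 6 coins)

for n_up in range(N + 1):
    multiplicity = comb(N, n_up)     # Omega(N_up) = N! / (N_up! (N - N_up)!)
    magnetization = 2 * n_up - N     # M is proportional to N_up - N_down
    print(f"N_up = {n_up}: M ∝ {magnetization:+d}, "
          f"multiplicity = {multiplicity}, probability = {multiplicity}/{2**N}")
```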