
Probability Review (many slides from Octavia Camps)


1 Probability Review (many slides from Octavia Camps)

2 Intuitive Development Intuitively, the probability of an event a can be defined as the limiting relative frequency P(a) = lim (n→∞) N(a)/n, where N(a) is the number of times event a occurs in n trials.
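As a quick illustration of this relative-frequency definition, here is a minimal Python sketch (not from the original slides; the die-rolling event is an assumed example) that estimates a probability by counting occurrences over many trials:

```python
import random

def estimate_probability(event, trials=100_000):
    """Estimate P(event) as the relative frequency N(event)/n over n trials."""
    hits = sum(event() for _ in range(trials))
    return hits / trials

# Example event: a fair six-sided die shows a 6 (true probability 1/6 ≈ 0.1667).
p_six = estimate_probability(lambda: random.randint(1, 6) == 6)
print(f"Estimated P(six) ≈ {p_six:.4f}")
```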

3 More Formal: Ω is the Sample Space: –Contains all possible outcomes of an experiment ω ∈ Ω is a single outcome A ⊆ Ω is a set of outcomes of interest (an event)

4 Independence A and B are independent if knowing that A has happened does not change the probability that B happens. The probability of independent events A, B and C is given by: P(ABC) = P(A)P(B)P(C)

5 Conditional Probability One of the most useful concepts! The probability of A given that B has occurred: P(A|B) = P(A ∩ B) / P(B)

6 Bayes Theorem Provides a way to convert a-priori probabilities to a-posteriori probabilities: P(A|B) = P(B|A) P(A) / P(B)

7 Using Partitions: If events A_i are mutually exclusive and partition Ω, then P(B) = Σ_i P(B|A_i) P(A_i), and Bayes' theorem becomes P(A_j|B) = P(B|A_j) P(A_j) / Σ_i P(B|A_i) P(A_i)
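A hedged numeric sketch of Bayes' theorem with a two-event partition {D, not D}; the disease/test numbers below are illustrative assumptions, not from the slides:

```python
# Prior over the partition {D, not D}: P(D) = 0.01.
p_d = 0.01
p_not_d = 1 - p_d

# Assumed likelihoods: P(+|D) = 0.95, P(+|not D) = 0.05.
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05

# Total probability of a positive test over the partition.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * p_not_d

# A-posteriori probability via Bayes' theorem.
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(f"P(D | +) = {p_d_given_pos:.3f}")  # ≈ 0.161
```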

8 Random Variables A (scalar) random variable X is a function that maps the outcome of a random event into real scalar values: ω ∈ Ω → X(ω) ∈ R

9 Random Variables Distributions Cumulative Probability Distribution (CDF): F_X(x) = P(X ≤ x) Probability Density Function (PDF): f_X(x) = dF_X(x)/dx

10 Random Distributions: From the two previous equations: F_X(x) = ∫ f_X(t) dt over (−∞, x], and P(a < X ≤ b) = F_X(b) − F_X(a) = ∫ f_X(x) dx over (a, b].
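A small numeric sketch (assumptions: a standard normal PDF and simple trapezoidal integration, chosen only for illustration) verifying that integrating the PDF recovers the CDF:

```python
import numpy as np
from math import erf, sqrt

# Standard normal PDF and its exact CDF (via the error function).
pdf = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
cdf_exact = lambda x: 0.5 * (1 + erf(x / sqrt(2)))

# Numerically integrate the PDF from -inf (approximated by -8) up to x.
x = 1.0
grid = np.linspace(-8, x, 100_001)
cdf_numeric = np.trapz(pdf(grid), grid)

print(f"integral of pdf up to {x}: {cdf_numeric:.6f}")
print(f"exact CDF at {x}:          {cdf_exact(x):.6f}")
```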

11 Uniform Distribution A R.V. X that is uniformly distributed between x_1 and x_2 has density function: f_X(x) = 1/(x_2 − x_1) for x_1 ≤ x ≤ x_2, and 0 otherwise.

12 Gaussian (Normal) Distribution A R.V. X that is normally distributed with mean μ and standard deviation σ has density function: f_X(x) = (1 / (σ √(2π))) exp(−(x − μ)² / (2σ²))
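A minimal sketch (not from the slides; μ = 2, σ = 0.5 are arbitrary assumed values) that evaluates this density and checks it against a histogram of NumPy samples:

```python
import numpy as np

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Density of a normal R.V. with mean mu and standard deviation sigma."""
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Compare the closed-form density against a normalized histogram of samples.
rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=0.5, size=100_000)
hist, edges = np.histogram(samples, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - gaussian_pdf(centers, mu=2.0, sigma=0.5)))
print(f"max |histogram - pdf| ≈ {max_err:.3f}")
```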

13 Statistical Characterizations Expectation (Mean Value, First Moment): E[X] = ∫ x f_X(x) dx = μ Second Moment: E[X²] = ∫ x² f_X(x) dx

14 Statistical Characterizations Variance of X: Var(X) = E[(X − μ)²] = ∫ (x − μ)² f_X(x) dx = σ² Standard Deviation of X: σ, the square root of the variance.

15 Mean Estimation from Samples Given a set of N samples x_i from a distribution, we can estimate the mean of the distribution by: μ̂ = (1/N) Σ_{i=1..N} x_i

16 Variance Estimation from Samples Given a set of N samples from a distribution, we can estimate the variance of the distribution by: σ̂² = (1/(N − 1)) Σ_{i=1..N} (x_i − μ̂)²
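A short NumPy sketch of these two sample estimators; the 1/(N − 1) factor matches the unbiased variance estimate above (the test data are synthetic):

```python
import numpy as np

def sample_mean_and_variance(x):
    """Estimate mean and (unbiased) variance from N samples."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mean = x.sum() / n                        # (1/N) * sum(x_i)
    var = ((x - mean)**2).sum() / (n - 1)     # (1/(N-1)) * sum((x_i - mean)^2)
    return mean, var

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=1000)
mu_hat, var_hat = sample_mean_and_variance(data)
print(f"mean ≈ {mu_hat:.3f} (true 5.0), variance ≈ {var_hat:.3f} (true 4.0)")
```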

17 Image Noise Model Additive noise: Î(x,y) = I(x,y) + n(x,y) –Most commonly used

18 Additive Noise Models Gaussian –Usually zero-mean, uncorrelated Uniform
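A hedged sketch of both additive noise models applied to an image array; the flat synthetic image and the noise parameters (σ = 10, range [−10, 10]) are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
image = np.full((64, 64), 128.0)  # flat synthetic "image" with intensity 128

# Gaussian additive noise: zero-mean, uncorrelated, sigma = 10.
gaussian_noisy = image + rng.normal(loc=0.0, scale=10.0, size=image.shape)

# Uniform additive noise on [-10, 10].
uniform_noisy = image + rng.uniform(low=-10.0, high=10.0, size=image.shape)

print("Gaussian noise std ≈", np.std(gaussian_noisy - image))
print("Uniform  noise std ≈", np.std(uniform_noisy - image))
```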

19 Measuring Noise Noise Amount: SNR = σ_s / σ_n Noise Estimation: –Given a sequence of images I_0, I_1, …, I_{N−1} of a static scene, compute the pixel-wise mean image Ī(x,y) = (1/N) Σ_i I_i(x,y) and estimate the noise variance at each pixel as σ_n(x,y)² = (1/(N−1)) Σ_i (I_i(x,y) − Ī(x,y))².
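A minimal sketch of this estimation on a simulated image stack; in practice I_0 … I_{N−1} would be repeated captures of a static scene, and the true noise level below is an assumption used only to check the estimate:

```python
import numpy as np

rng = np.random.default_rng(3)
true_image = rng.uniform(0, 255, size=(32, 32))             # static scene
true_sigma_n = 5.0
stack = np.array([true_image + rng.normal(0, true_sigma_n, true_image.shape)
                  for _ in range(50)])                       # I_0 ... I_{N-1}

# Pixel-wise mean image and unbiased pixel-wise noise variance.
mean_image = stack.mean(axis=0)
noise_var = ((stack - mean_image)**2).sum(axis=0) / (stack.shape[0] - 1)

sigma_n = np.sqrt(noise_var.mean())   # average noise standard deviation
sigma_s = np.std(mean_image)          # spread of the (denoised) signal
print(f"estimated sigma_n ≈ {sigma_n:.2f}, SNR ≈ {sigma_s / sigma_n:.1f}")
```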

20 Good estimators Data values z are random variables. A parameter θ describes the distribution. We have an estimator φ(z) of the unknown parameter θ. If E(φ(z)) = θ (or, for a random parameter, E(φ(z)) = E(θ)), the estimator φ(z) is unbiased.

21 Balance between bias and variance Mean squared error as performance criterion: MSE = E[(φ(z) − θ)²] = (bias)² + variance
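A Monte Carlo sketch (synthetic normal samples, assumed true variance 4) comparing the biased 1/N and unbiased 1/(N−1) variance estimators in terms of bias, variance, and MSE; the specific sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
true_var, n, trials = 4.0, 10, 100_000

biased, unbiased = [], []
for _ in range(trials):
    z = rng.normal(0.0, np.sqrt(true_var), size=n)
    m = z.mean()
    biased.append(((z - m)**2).sum() / n)          # 1/N estimator (biased)
    unbiased.append(((z - m)**2).sum() / (n - 1))  # 1/(N-1) estimator (unbiased)

for name, est in [("1/N", np.array(biased)), ("1/(N-1)", np.array(unbiased))]:
    bias = est.mean() - true_var
    mse = ((est - true_var)**2).mean()             # ≈ bias^2 + variance
    print(f"{name:8s} bias={bias:+.3f} var={est.var():.3f} MSE={mse:.3f}")
```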

22 Least Squares (LS) Model A u = b, solved by û = (AᵀA)⁻¹ Aᵀ b. If errors are only in b, then LS is unbiased. But if there are also errors in A (the explanatory variables), LS is biased.
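A minimal sketch of the ordinary LS solve with noise only in b (the synthetic A, true u, and noise level are assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
u_true = np.array([2.0, -1.0])

A = rng.normal(size=(100, 2))                  # exact explanatory variables
b = A @ u_true + rng.normal(0, 0.1, size=100)  # errors only in b

u_hat, *_ = np.linalg.lstsq(A, b, rcond=None)  # u_hat = (A^T A)^{-1} A^T b
print("LS estimate:", u_hat)                   # ≈ [2, -1]; unbiased in this case
```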

23 Errors in Variable Model Both the data matrix and the right-hand side are observed with noise: Ã = A + ΔA and b̃ = b + Δb, while the true relation is A u = b.

24 Least Squares (LS) bias Larger variance in ΔA, an ill-conditioned A, or u oriented close to the eigenvector of the smallest eigenvalue all increase the bias. Generally the result is an underestimation.
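A small simulation sketch illustrating this underestimation in the simplest one-dimensional case; the regression numbers below are assumptions chosen so the expected attenuation is easy to predict:

```python
import numpy as np

rng = np.random.default_rng(6)
u_true, n, trials = 2.0, 200, 2000

estimates = []
for _ in range(trials):
    a_true = rng.normal(0, 1, size=n)
    b = a_true * u_true + rng.normal(0, 0.1, size=n)   # errors in b
    a_noisy = a_true + rng.normal(0, 0.5, size=n)      # errors also in A
    # 1-D least squares: u_hat = (a^T b) / (a^T a)
    estimates.append(a_noisy @ b / (a_noisy @ a_noisy))

print(f"mean LS estimate ≈ {np.mean(estimates):.3f} (true value {u_true})")
# Underestimation: with var(noise)=0.25 and var(a)=1, expect ≈ 2 * 1/1.25 = 1.6
```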

25 Estimation of optical flow [figure with two panels] (a) Local information determines the component of flow perpendicular to edges. (b) The optical flow computed as the best intersection of the flow constraints is biased.

26 Optical flow One patch gives a system: for every pixel i in the patch, I_x(i) u + I_y(i) v + I_t(i) = 0, i.e. A [u v]ᵀ = b with rows [I_x(i) I_y(i)] in A and −I_t(i) in b.
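A hedged Lucas–Kanade-style sketch solving one patch's system in the least-squares sense; the gradients are synthetic and the image-gradient computation itself is omitted, so this only illustrates the linear solve:

```python
import numpy as np

rng = np.random.default_rng(7)
flow_true = np.array([0.5, -0.2])              # assumed true (u, v) for the patch

# Synthetic spatial gradients at each pixel of the patch, and matching
# temporal derivatives I_t = -(I_x*u + I_y*v) + small noise.
grads = rng.normal(size=(25, 2))               # rows [I_x(i), I_y(i)]
i_t = -(grads @ flow_true) + rng.normal(0, 0.01, size=25)

# Each pixel gives I_x*u + I_y*v + I_t = 0  ->  A [u v]^T = -I_t
A, b = grads, -i_t
flow_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated flow (u, v):", flow_hat)
```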

27 Noise model Additive, independently and identically distributed, symmetric noise on the measured derivatives: Ĩ_x = I_x + n_x, Ĩ_y = I_y + n_y, Ĩ_t = I_t + n_t.

