1  ECE533 Digital Image Processing: Review of Probability, Random Process, and Random Field for Image Processing (© 2002-2003 by Yu Hen Hu)

2  Probability Models
• Experiment:
  » Throw a die, toss a coin, ...
  » Each experiment has an outcome.
  » Experiments can be repeated.
• Sample space (Ω):
  » The set of all possible outcomes of an experiment.
• Event:
  » A subset of outcomes in Ω that has a particular meaning.
• Probability of an event A (equally likely outcomes):
  » P(A) = |A|/|Ω|
  » |A|: cardinality of A, the number of elements in set A.
• Example: card draw (see the simulation sketch below)
  » A single card is drawn from a well-shuffled deck of playing cards. Find P(drawing an Ace).
  » Ω = {1, 2, ..., 52}, |Ω| = 52.
  » Event A = drawing an Ace. If the four Ace cards are labeled 1, 2, 3, 4, then A = {1, 2, 3, 4} and |A| = 4.
  » Thus, P(A) = 4/52 = 1/13.
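A quick numerical check (not from the original slides): a minimal sketch that, assuming the labels 1-4 stand for the four Aces, estimates P(drawing an Ace) by simulation and compares it with the counting formula P(A) = |A|/|Ω|.

```python
# Minimal sketch (assumption, not from the slides): Monte Carlo estimate of
# P(drawing an Ace) versus the counting formula |A|/|Omega|.
import random

random.seed(0)
omega = list(range(1, 53))          # label the 52 cards 1..52
ace_labels = {1, 2, 3, 4}           # assume the four Aces carry labels 1-4

n_trials = 200_000
hits = sum(1 for _ in range(n_trials) if random.choice(omega) in ace_labels)

print("Monte Carlo estimate:", hits / n_trials)               # close to 1/13 = 0.0769
print("Counting formula    :", len(ace_labels) / len(omega))  # 4/52 = 1/13
```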

3  Axioms of a Probability Model
• Each outcome ω_i of an experiment can be assigned a probability measure P(ω_i) such that 0 ≤ P(ω_i) ≤ 1.
• For fair experiments where each outcome is equally likely to occur, P(ω_i) = 1/|Ω|.
• In general, the probability of an event, which is a set of outcomes, is evaluated as P(A) = Σ_{ω_i ∈ A} P(ω_i).
• Given a set A, its probability measure P(A) has the following properties:
  1. P(∅) = 0. The empty set is an impossible event.
  2. P(A) ≥ 0 for every event A.
  3. If A_m ∩ A_n = ∅ for m ≠ n, then P(∪_n A_n) = Σ_n P(A_n).
  4. P(Ω) = 1. The probability of the entire sample space is unity.

4  Independence
Question: Given a fair coin and a well-shuffled deck of cards, what is the probability of tossing the coin and observing a Head AND drawing the Jack of hearts?
Answer: P(Head) = 1/2 and P(Jack of hearts) = 1/52. The events of tossing a coin and drawing a card are independent, hence P(Head AND Jack of hearts) = P(Head) × P(Jack of hearts) = 1/104.
• Independence: Two events A and B are statistically independent if P(A ∩ B) = P(A)P(B).
• Independence of N events: Given N events {A_n; 1 ≤ n ≤ N}, these N events are mutually independent iff P(∩_{n∈J} A_n) = Π_{n∈J} P(A_n), where J ⊆ {1, 2, ..., N} is any subset of the indices.
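A minimal simulation sketch (an addition, not from the slides), assuming card index 0 stands for the Jack of hearts, checking that the joint probability of two independent events matches the product of their probabilities.

```python
# Minimal sketch (assumption, not from the slides): simulate the joint event
# "Head AND Jack of hearts" and compare with P(Head)*P(Jack of hearts) = 1/104.
import random

random.seed(1)
n_trials = 500_000
joint = 0
for _ in range(n_trials):
    head = random.random() < 0.5          # fair coin toss
    card = random.randrange(52)           # assume card 0 is the Jack of hearts
    joint += head and card == 0

print("simulated P:", joint / n_trials)   # approximately 1/104 = 0.0096
print("product    :", 0.5 * (1 / 52))
```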

5  Conditional Probability
• Let A and B be two events in the same sample space Ω. Given that B has occurred, the conditional probability that A will also occur is defined as P(A|B) = P(A ∩ B)/P(B), assuming P(B) ≠ 0.
• Theorem: If A and B are independent events, then P(A|B) = P(A)P(B)/P(B) = P(A).
• Example: A fair die is thrown twice. Given that the sum of the two outcomes is 9, what is the probability that the outcome of the first throw is 4? (A brute-force check follows below.)
• Answer: Let the outcome of the first throw be m and of the second throw be n. Then
  B = {(m,n): m+n = 9, 1 ≤ m,n ≤ 6} = {(3,6), (4,5), (5,4), (6,3)}
  A ∩ B = {(m,n): m = 4, n = 5, 1 ≤ m,n ≤ 6} = {(4,5)}
  P(A|B) = P(A ∩ B)/P(B) = (1/36)/(4/36) = 1/4.
  Note that P(A) = 1/6, so conditioning on B changes the probability.
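A minimal enumeration sketch (an addition, not from the slides) that lists all 36 equally likely outcomes of the two throws and computes the conditional probability directly.

```python
# Minimal sketch (assumption, not from the slides): brute-force enumeration of
# P(first throw = 4 | sum = 9) for two throws of a fair die.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))           # all 36 equally likely pairs
B = [(m, n) for (m, n) in outcomes if m + n == 9]         # conditioning event
A_and_B = [(m, n) for (m, n) in B if m == 4]

print("P(A|B) =", len(A_and_B) / len(B))                                # 1/4
print("P(A)   =", sum(m == 4 for (m, _) in outcomes) / len(outcomes))   # 1/6
```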

6  Law of Total Probability and Bayes' Rule
• Law of total probability: Let {B_n} be a set of events that partitions the sample space Ω, i.e. B_m ∩ B_n = ∅ for m ≠ n and ∪_n B_n = Ω. Then for any event A ⊆ Ω,
  P(A) = Σ_n P(A ∩ B_n) = Σ_n P(A|B_n) P(B_n).
• Thus, Bayes' rule:
  P(B_k|A) = P(A|B_k) P(B_k) / Σ_n P(A|B_n) P(B_n).
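A minimal worked sketch (an addition, not from the slides) on a hypothetical two-event partition, showing the total-probability sum and the resulting Bayes posterior.

```python
# Minimal sketch (assumption, not from the slides): total probability and
# Bayes' rule for a hypothetical two-state partition {B0, B1} of Omega.
priors = {"B0": 0.7, "B1": 0.3}                 # P(B_n), a partition of Omega
likelihood = {"B0": 0.1, "B1": 0.8}             # P(A | B_n) for some event A

# Law of total probability: P(A) = sum_n P(A|B_n) P(B_n)
p_A = sum(likelihood[b] * priors[b] for b in priors)

# Bayes' rule: P(B_n | A) = P(A|B_n) P(B_n) / P(A)
posterior = {b: likelihood[b] * priors[b] / p_A for b in priors}

print("P(A) =", p_A)                            # 0.07 + 0.24 = 0.31
print("posterior:", posterior)                  # the posteriors sum to 1
```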

7  Random Variable
• A random variable X(ω) is a real-valued function defined for points ω in a sample space Ω.
• Example: If Ω is the whole class of students and ω is an individual student, we may define X(ω) as the height of that student (in feet).
• Question: What is the probability that a student's height is between 5 feet and 6 feet?
• Define B = [5, 6]. Our goal is to find the probability P({ω ∈ Ω: 5 ≤ X(ω) ≤ 6}) = P({ω ∈ Ω: X(ω) ∈ B}) = P({X ∈ B}).
• In general, we are interested in the probability P({X ∈ B}), or for convenience, P(X ∈ B).
  » If B = {x_o} is a singleton set, we may simply write P(X = x_o).
• Example 2.1: P(a < X ≤ b) = P(X ≤ b) - P(X ≤ a)
• Example 2.2: P(X = 0 or X = 1) = P(X = 0) + P(X = 1)

8  Probability Mass Functions (PMF) and Expectations
• The probability mass function (PMF) of a discrete random variable X is defined by p_X(x_i) = P(X = x_i). Hence Σ_i p_X(x_i) = 1.
• Joint PMF of X and Y: p_XY(x_i, y_j) = P(X = x_i, Y = y_j).
• Marginal PMFs: p_X(x_i) = Σ_j p_XY(x_i, y_j) and p_Y(y_j) = Σ_i p_XY(x_i, y_j).
• Expectation (mean, average): E[X] = Σ_i x_i p_X(x_i).
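A minimal sketch (an addition, not from the slides) with a small hypothetical joint PMF table, showing how the marginals and expectations fall out of the definitions above.

```python
# Minimal sketch (assumption, not from the slides): a joint PMF table,
# its marginal PMFs, and the expectations computed from them.
import numpy as np

x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])

# p_XY[i, j] = P(X = x_i, Y = y_j); entries are >= 0 and sum to 1
p_XY = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.05, 0.20]])
assert np.isclose(p_XY.sum(), 1.0)

p_X = p_XY.sum(axis=1)            # marginal PMF of X: sum over j
p_Y = p_XY.sum(axis=0)            # marginal PMF of Y: sum over i

E_X = np.sum(x_vals * p_X)        # E[X] = sum_i x_i p_X(x_i)
E_Y = np.sum(y_vals * p_Y)

print("p_X =", p_X, " p_Y =", p_Y)
print("E[X] =", E_X, " E[Y] =", E_Y)
```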

9  Moments, Variance, and Standard Deviation
• n-th moment: E[X^n], defined for a real-valued random variable X.
• Variance and standard deviation: let m = E[X]. Then
  Var[X] = E[(X - m)^2] = E[X^2 - 2Xm + m^2] = E[X^2] - 2m E[X] + m^2 = E[X^2] - m^2 = E[X^2] - (E[X])^2,
  and the standard deviation is the square root of Var[X].
• Example: Find E[X^2] and var(X) of a Bernoulli r.v. X with P(X = 1) = p.
  E[X^2] = 0^2 (1 - p) + 1^2 p = p. Since E[X] = p, Var(X) = E[X^2] - (E[X])^2 = p - p^2 = p(1 - p).
• Example: Let X ~ Poisson(λ). Since E[X(X - 1)] = λ^2 and E[X] = λ, we have E[X^2] = λ^2 + λ. Thus var(X) = (λ^2 + λ) - λ^2 = λ. (A sample-based check follows below.)
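A minimal sample-based sketch (an addition, not from the slides) checking the two variance results numerically for arbitrary illustrative values of p and λ.

```python
# Minimal sketch (assumption, not from the slides): sample-based checks of
# Var(Bernoulli(p)) = p(1-p) and Var(Poisson(lam)) = lam.
import numpy as np

rng = np.random.default_rng(0)
p, lam, n = 0.3, 4.0, 1_000_000

bern = (rng.random(n) < p).astype(int)   # Bernoulli(p) samples as 0/1
pois = rng.poisson(lam, size=n)          # Poisson(lam) samples

print("Bernoulli var:", bern.var(), "vs p(1-p) =", p * (1 - p))
print("Poisson   var:", pois.var(), "vs lambda =", lam)
```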

10  Conditional Probability
• The conditional probability of random variables is defined as P(Y ∈ C | X ∈ B) = P(X ∈ B, Y ∈ C) / P(X ∈ B).
• In terms of PMFs, p_Y|X(y_j | x_i) = p_XY(x_i, y_j) / p_X(x_i).
• Example: Let X = message to be sent (an integer). For X = i, light of intensity λ_i is directed at a photodetector, and Y ~ Poisson(λ_i) is the number of photoelectrons generated at the detector. Find P(Y < 2 | X = i). (A numerical check follows below.)
• Solution: P(Y = n | X = i) = e^{-λ_i} λ_i^n / n! for n = 0, 1, 2, ...
  Thus P(Y < 2 | X = i) = P(Y = 0 | X = i) + P(Y = 1 | X = i) = e^{-λ_i} (1 + λ_i).
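A minimal sketch (an addition, not from the slides), using an arbitrary illustrative intensity λ_i, that compares the closed-form answer with a Poisson simulation.

```python
# Minimal sketch (assumption, not from the slides): P(Y < 2 | X = i) for the
# photodetector example, checked against the closed form exp(-lam)*(1 + lam).
import math
import numpy as np

lam_i = 2.5                                     # hypothetical intensity lambda_i
closed_form = math.exp(-lam_i) * (1 + lam_i)

rng = np.random.default_rng(0)
samples = rng.poisson(lam_i, size=500_000)      # Y | X = i  ~  Poisson(lambda_i)
estimate = np.mean(samples < 2)

print("closed form:", closed_form)
print("simulation :", estimate)
```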

11  Definitions of Continuous R.V.s
• Definition (continuous random variable): Let X(ω) be a random variable defined on a sample space Ω. X is a continuous random variable if its probabilities can be written as integrals of a density, P(X ∈ B) = ∫_B f_X(x) dx, for some nonnegative function f_X.
• Definition (probability density function, pdf): f(x) is a probability density function if f(x) ≥ 0 for all x and ∫_{-∞}^{∞} f(x) dx = 1.

12  Cumulative Distribution Function
• Definition: the cumulative distribution function (cdf) of a random variable X is defined by F_X(x) = P(X ≤ x).
• If X is a continuous random variable, F_X(x) = ∫_{-∞}^{x} f_X(t) dt and f_X(x) = dF_X(x)/dx.
• Properties of CDFs:
  (a) 0 ≤ F(x) ≤ 1.
  (b) F(b) - F(a) = P(a < X ≤ b).
  (c) a < b implies F(a) ≤ F(b), i.e. F is nondecreasing.
  (d) lim_{x→∞} F(x) = 1.
  (e) lim_{x→-∞} F(x) = 0.
  (f) F(x) is right continuous, i.e. F(x+) = lim_{h↓0} F(x + h) = F(x).
  (g) P(X > x) = 1 - F(x).
  (h) P(X = x_0) = F(x_0) - F(x_0^-).
• Note that if F(x) is continuous at x_0, then F(x_0^+) = F(x_0^-) = F(x_0). From (h), P(X = x_0) = 0.
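A minimal sketch (an addition, not from the slides) comparing an empirical CDF built from samples with the analytic CDF F(x) = 1 - e^{-x} of an Exponential(1) random variable.

```python
# Minimal sketch (assumption, not from the slides): empirical versus analytic
# CDF of an Exponential(1) random variable.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=100_000)

def empirical_cdf(samples, x):
    """F_hat(x) = fraction of samples <= x, an estimate of P(X <= x)."""
    return np.mean(samples <= x)

for x in (0.5, 1.0, 2.0):
    print(f"x={x}: empirical={empirical_cdf(samples, x):.4f}",
          f"analytic={1 - np.exp(-x):.4f}")
```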

13  Functions of Random Variables
• Let X be a random variable and g(·) a real-valued function. Y = g(X) is a new random variable. We want to find P(Y ∈ C) in terms of F_X(x). For this, we must find the set B such that {Y ∈ C} = {X ∈ B}, so that P(Y ∈ C) = P(X ∈ B).
• To find F_Y(y), take C = (-∞, y], or equivalently, F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y) = P(X ∈ {x: g(x) ≤ y}).
• Example: X is an input voltage, a random variable, and Y = g(X) = aX + b, where a ≠ 0 is the gain and b is the offset voltage. (A histogram check follows below.)
• Solution: g(x) ≤ y iff x ≤ (y - b)/a for a > 0, and x ≥ (y - b)/a for a < 0.
  a > 0: F_Y(y) = F_X((y - b)/a), so f_Y(y) = dF_Y/dy = (1/a) f_X((y - b)/a).
  a < 0: F_Y(y) = 1 - F_X((y - b)/a), so f_Y(y) = dF_Y/dy = (-1/a) f_X((y - b)/a).
  In summary, f_Y(y) = (1/|a|) f_X((y - b)/a).
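A minimal sketch (an addition, not from the slides), assuming X ~ N(0, 1) and arbitrary illustrative values of a and b, that checks the change-of-variable formula against a histogram density estimate.

```python
# Minimal sketch (assumption, not from the slides): checking
# f_Y(y) = (1/|a|) f_X((y-b)/a) for Y = aX + b with X ~ N(0, 1) and a < 0.
import numpy as np

rng = np.random.default_rng(0)
a, b = -2.0, 3.0
x = rng.standard_normal(1_000_000)
y = a * x + b

def f_X(x):                                   # standard normal pdf
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Compare a histogram-based density estimate with the change-of-variable formula
hist, edges = np.histogram(y, bins=100, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = f_X((centers - b) / a) / abs(a)

print("max abs error:", np.max(np.abs(hist - predicted)))   # small (sampling noise)
```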

14  Random Processes and Random Fields
• Random process: a family of random variables X_t(ω).
  » For each fixed outcome ω ∈ Ω, X_t(ω) is a function of t (time).
  » For a fixed index t, X_t(ω) is a random variable defined on Ω.
• Example a: A jukebox has 6 songs. You roll a die and pick a song based on its outcome.
• Example b: Let t ∈ {0, 1, 2, ...}. At each t, toss a coin; X_t(ω) = 0 if the outcome is a tail and X_t(ω) = 1 if it is a head.
• Random field: a random field is a random process defined on 2D space rather than on 1D time. For a monochrome image, the intensity of each pixel f(x, y) = X_{x,y}(ω) is modeled as a random variable. For a particular outcome ω_i, f(x, y) is a deterministic function of x and y.
• All results applicable to random processes can be applied to random fields.
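A minimal sketch (an addition, not from the slides) of the two views of a random field: fixing the outcome ω (here, the seed) gives one deterministic image, while fixing a pixel across many outcomes gives samples of a single random variable. The Gaussian intensity model is only an illustrative assumption.

```python
# Minimal sketch (assumption, not from the slides): a noise image as a random field.
import numpy as np

H, W = 64, 64

# One outcome omega -> one deterministic 64x64 image of Gaussian intensities
rng = np.random.default_rng(seed=0)
image = rng.normal(loc=128, scale=20, size=(H, W))
print("one realization, f(10, 20) =", image[10, 20])

# Many outcomes -> the pixel X_{10,20}(omega) is a random variable
pixel_samples = [np.random.default_rng(seed=s).normal(128, 20, (H, W))[10, 20]
                 for s in range(2000)]
print("pixel mean ~", np.mean(pixel_samples), " pixel std ~", np.std(pixel_samples))
```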

15  Mean, Correlation and Covariance
• Mean: if X_t is a random process, its mean function is m_X(t) = E[X_t], where the expectation is taken w.r.t. the pmf or pdf of X_t at time t.
• Correlation: R_X(u, v) = E[X_u X_v].
• Covariance: C_X(u, v) = E[(X_u - m_X(u))(X_v - m_X(v))] = R_X(u, v) - m_X(u) m_X(v).
• Example a: Denote s_i(t) to be the time function of the i-th song. Then m_X(t) = (1/6) Σ_{i=1}^{6} s_i(t), since each song is picked with probability 1/6.
• Example d: Given that X_0 = 5, P(X_1 = 4) = 1, hence m_X(1) = 4. P(X_2 = 3) = (4/5)(4/5) = 16/25, P(X_2 = 4) = (4/5)(1/5) + (1/5)(4/5) = 8/25, P(X_2 = 5) = 1/25. Thus m_X(2) = 3(16/25) + 4(8/25) + 5(1/25) = 17/5.

16  Stationary Process and WSS
• Any property that depends on the values of {X_t} at k index points t_1, t_2, ..., t_k is completely characterized by the joint pdf (or pmf) of X_{t_1}, X_{t_2}, ..., X_{t_k}, denoted (pdf case) by f(X(t_1), ..., X(t_k)).
• Definition (stationary process): {X_t} is (strictly) stationary if for any finite set of time points {t_1, t_2, ..., t_k}, the joint pdf is time invariant:
  f(X(t_1), ..., X(t_k)) = f(X(t_1 + τ), ..., X(t_k + τ)) for every shift τ.
• Definition (wide-sense stationary): {X_t} is wide-sense stationary if its first two moments are independent of time. That is,
  m_X(t) = E[X_t] = m_X and R_X(u, v) = R_X(u - v).
  Letting u = t + τ and v = t, we may write R_X(u, v) = R_X((t + τ) - t) = R_X(τ).

17  Power Spectral Density and Power
• Definition (power spectral density): S_X(f) = ∫ R_X(τ) e^{-j2πfτ} dτ, the Fourier transform of the correlation function, gives the power density of the process X_t at each frequency f. Hence it must be non-negative.
• Definition (power): P_X = E[X_t^2] = R_X(0) = ∫ S_X(f) df.
• Properties of the PSD and the correlation function:
  a) R(-τ) = R(τ). Hence S_X(f) is a real-valued, even function.
  b) |R(τ)| ≤ R(0). To prove, use the Cauchy-Schwarz inequality: (E[UV])^2 ≤ E[U^2] E[V^2].
  c) S_X(f) is real, even, and non-negative.
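A minimal sketch (an addition, not from the slides), using a zero-mean white-noise example, that checks the power relation P_X = R_X(0) = ∫ S_X(f) df with a periodogram estimate of the PSD.

```python
# Minimal sketch (assumption, not from the slides): for zero-mean white noise,
# the power R_X(0) = E[X_t^2] matches the integral of the estimated PSD.
import numpy as np

rng = np.random.default_rng(0)
n, sigma2 = 4096, 2.0
x = rng.normal(0.0, np.sqrt(sigma2), size=n)      # WSS white noise, power sigma2

# Periodogram estimate of the PSD (frequency spacing 1/n in cycles/sample)
X = np.fft.fft(x)
psd = (np.abs(X) ** 2) / n
power_from_psd = psd.sum() / n                     # approximates the integral of S_X(f)

print("R_X(0) estimate :", np.mean(x ** 2))
print("integral of PSD :", power_from_psd)         # the two agree (Parseval)
```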

18  LTI System: A Brief Review
• A system y(t) = L[x(t)] is a mapping of a function x(t) to a function y(t).
• L[·] is a linear system iff L[a x_1 + b x_2] = a L[x_1] + b L[x_2].
• L[·] is time invariant iff L[x(t + u)] = y(t + u).
• An LTI (linear, time-invariant) system can be uniquely characterized by its impulse response h(t) = L[δ(t)].
• Given an LTI system y(t) = L[x(t)], y(t) can be obtained via the convolution of x(t) with the impulse response h(t): y(t) = ∫ h(t - s) x(s) ds = (h * x)(t).
• The Fourier transform of h(t), H(f) = ∫ h(t) e^{-j2πft} dt, is called the transfer function.
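A minimal discrete-time sketch (an addition, not from the slides): applying an LTI system by convolution with an assumed moving-average impulse response, and sampling its transfer function with the FFT.

```python
# Minimal sketch (assumption, not from the slides): a discrete-time LTI system
# applied by convolution, and its transfer function obtained via the FFT.
import numpy as np

h = np.array([0.25, 0.5, 0.25])                 # impulse response (moving-average filter)
x = np.zeros(16); x[0] = 1.0                    # unit impulse delta[n]

y = np.convolve(x, h)[:len(x)]                  # output = h * x; recovers h for impulse input
print("y[:3] =", y[:3])                         # [0.25, 0.5, 0.25]

H = np.fft.rfft(h, n=64)                        # transfer function samples H(f)
print("|H| at DC and Nyquist:", abs(H[0]), abs(H[-1]))   # 1.0 (lowpass) and ~0
```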

19  LTI System with WSS Input
• Let a WSS process X_t be the input to an LTI system with impulse response h(t). The output is denoted by Y_t = ∫ h(s) X_{t-s} ds.
• If E[X_t] = m, then E[Y_t] = m ∫ h(s) ds = m H(0), a constant.
• Cross correlation between X_t and Y_t: E[X_{t+τ} Y_t] = ∫ h(s) E[X_{t+τ} X_{t-s}] ds = ∫ h(s) R_X(τ + s) ds, which depends only on the lag τ.
• Define R_XY(τ) = ∫ h(s) R_X(τ + s) ds.

20  WSS Input to an LTI System (Cont'd)
• Define the cross power spectral density S_XY(f) as the Fourier transform of R_XY(τ); then S_XY(f) = H*(f) S_X(f).
• Therefore, R_Y(τ) = E[Y_{t+τ} Y_t] = ∫ h(s) E[X_{t+τ-s} Y_t] ds. Substituting the lag τ - s, we have R_Y(τ) = ∫ h(s) R_XY(τ - s) ds = (h * R_XY)(τ).
• Taking the Fourier transform, S_Y(f) = H(f) S_XY(f) = H(f) H*(f) S_X(f) = |H(f)|^2 S_X(f).
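A minimal sketch (an addition, not from the slides): a numerical check of the output-PSD relation S_Y(f) = |H(f)|^2 S_X(f), assuming white-noise input and the same illustrative FIR filter used above.

```python
# Minimal sketch (assumption, not from the slides): numerical check of
# S_Y(f) = |H(f)|^2 S_X(f) for white noise X_t passed through an FIR filter h.
import numpy as np

rng = np.random.default_rng(0)
h = np.array([0.25, 0.5, 0.25])                 # impulse response
sigma2, n, n_trials = 1.0, 1024, 200            # white noise: S_X(f) = sigma2

psd_y = np.zeros(n)
for _ in range(n_trials):
    x = rng.normal(0.0, np.sqrt(sigma2), size=n)
    y = np.convolve(x, h, mode="same")          # Y_t = (h * X)_t
    psd_y += np.abs(np.fft.fft(y)) ** 2 / n
psd_y /= n_trials                               # averaged periodogram of Y

H = np.fft.fft(h, n=n)                          # transfer function on the same grid
predicted = np.abs(H) ** 2 * sigma2             # |H(f)|^2 S_X(f)

print("mean abs error:", np.mean(np.abs(psd_y - predicted)))   # small (estimation noise)
```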

