Motivation: In electromagnetic systems, the energy per photon = hν. In communication systems, noise can be either quantum or additive from the measurement system (receiver, etc.). The noise power in a communication system is 4kTB, where k is Boltzmann's constant, T is the absolute temperature, and B is the bandwidth of the system. When making a measurement (e.g. measuring voltage in a receiver) over a time element 1/B, the noise energy can be written as 4kT. The standard deviation of the number of photons per time element appears in the denominator of the SNR.

Motivation: Recall h = 6.63x10^-34 J·s and k = 1.38x10^-23 J/K. When the frequency ν is high enough that hν >> 4kT, quantum noise dominates. In the X-ray region, where frequencies are on the order of 10^19 Hz, hν >> 4kT, so x-ray imaging is quantum limited due to the discrete number of photons per pixel. We need to know the mean and variance of the random processes that generate x-ray photons, absorb them, and record them.
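
A quick numerical check of this comparison (a sketch; room temperature T = 300 K is an assumed value, not stated on the slide):

```python
h = 6.63e-34      # Planck's constant, J*s
k = 1.38e-23      # Boltzmann's constant, J/K
T = 300.0         # absolute temperature in K (assumed room temperature)
nu = 1e19         # representative x-ray frequency, Hz

print("h*nu  =", h * nu)                 # photon energy, ~6.6e-15 J
print("4kT   =", 4 * k * T)              # thermal noise energy per time element, ~1.7e-20 J
print("ratio =", h * nu / (4 * k * T))   # ~4e5, so h*nu >> 4kT
```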

Motivation: We will be working toward describing the SNR of medical imaging systems with the model above. We will consider our ability to detect some object (here shown in blue) that has a different property, in this case attenuation, from the background (shown here in green). To do so, we have to be able to describe the random processes that cause the x-ray intensity to vary across the background. [Figure: object we are trying to detect against a background, with intensity difference ∆I on background intensity I.] Contrast C = ∆I / I; SNR = ∆I / σ_I = C·I / σ_I.
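
Here is a minimal Python sketch of the contrast and SNR definitions above; the intensity and noise numbers are made up for illustration only.

```python
# Sketch of the contrast/SNR definitions above; all numbers are illustrative.
I_background = 1000.0   # mean background intensity I
I_object = 900.0        # mean intensity over the object we are trying to detect
sigma_I = 30.0          # standard deviation of the background intensity

delta_I = abs(I_background - I_object)   # delta-I
contrast = delta_I / I_background        # C = delta-I / I
snr = delta_I / sigma_I                  # SNR = delta-I / sigma_I = C * I / sigma_I

print(f"C = {contrast:.2f}, SNR = {snr:.2f}")   # C = 0.10, SNR = 3.33
```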

The value of a rolled die is a random process. The outcome of rolling the die is a random variable taking discrete values. Let's call the random variable X. We then write that the probability of X taking the value n is p_X(n) = 1/6 for n = 1, ..., 6. Note: because the probability of every outcome is equal, we say this random variable has a uniform probability distribution.

[Figure: the probability density function (pdf), flat at 1/6, and the cumulative probability distribution, stepping up to 1, for the die roll.]

Continuous Random Variables
The pdf is the derivative of the cumulative distribution function (cdf):
1. The cdf is the integral of the pdf.
2. The cdf must lie between 0 and 1.
3. p_X(x) ≥ 0.
P[x_1 ≤ X ≤ x_2] = F(x_2) - F(x_1) = ∫_{x_1}^{x_2} p_X(x) dx
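
As a numerical illustration of these relations (a sketch, not from the slides), the snippet below uses an exponential pdf, the same form that reappears later in the screen analysis, and checks that F(x_2) - F(x_1) equals the integral of the pdf from x_1 to x_2.

```python
import math

# Exponential pdf and cdf: p_X(x) = mu * exp(-mu*x), F(x) = 1 - exp(-mu*x), x >= 0.
mu = 2.0

def pdf(x):
    return mu * math.exp(-mu * x)

def cdf(x):
    return 1.0 - math.exp(-mu * x)

x1, x2 = 0.2, 0.9

# P[x1 <= X <= x2] from the cdf
p_from_cdf = cdf(x2) - cdf(x1)

# Same probability by numerically integrating the pdf (midpoint rule)
n = 10_000
dx = (x2 - x1) / n
p_from_pdf = sum(pdf(x1 + (i + 0.5) * dx) for i in range(n)) * dx

print(p_from_cdf, p_from_pdf)   # the two values agree closely
```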

Zeroth Order Statistics
Not concerned with the relationship between events along a random process; just looks at one point in time or space.
- Mean of X, μ_X, or expected value of X, E[X]: measures the first moment of p_X(x).
- Variance of X, σ_X^2, or E[(X - μ)^2]: measures the second (central) moment of p_X(x).
- Standard deviation σ_X: the square root of the variance.

Zeroth Order Statistics: Recall E[X] = Σ_n n p_X(n) for a discrete random variable, or ∫ x p_X(x) dx for a continuous one. Variance of X: σ_X^2 = E[(X - μ)^2] = E[X^2] - μ^2.
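
Applying these definitions to the fair-die pmf from the earlier slide (a short sketch):

```python
# Mean and variance of a single fair die, p_X(n) = 1/6 for n = 1..6.
outcomes = range(1, 7)
p = 1.0 / 6.0

mean = sum(n * p for n in outcomes)          # E[X]   = 3.5
mean_sq = sum(n**2 * p for n in outcomes)    # E[X^2] = 15.1667
var = mean_sq - mean**2                      # sigma^2 = E[X^2] - E^2[X] = 2.9167

print(mean, var)
```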

p(n) for throwing 2 dice: [Figure: pmf with values 1/36, 2/36, 3/36, 4/36, 5/36, 6/36, rising to a peak at n = 7 and falling back symmetrically to 1/36 at n = 12.]

Fair dice: each die is independent. Let the result of die 1 be x, with random variable X, and the result of die 2 be y, with random variable Y. With independence, p_XY(x, y) = p_X(x) p_Y(y), and
E[XY] = ∫∫ xy p_XY(x, y) dx dy = ∫ x p_X(x) dx ∫ y p_Y(y) dy = E[X] E[Y] if X and Y are independent.

1. E[X + Y] = E[X] + E[Y]  (always)
2. E[aX] = a E[X]  (always)
3. σ_X^2 = E[X^2] - E^2[X]  (always)
4. σ^2(aX) = a^2 σ_X^2  (always)
5. E[X + c] = E[X] + c
6. Var(X + Y) = Var(X) + Var(Y) only if X and Y are statistically independent.
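
A small Monte Carlo sketch (not from the slides) that spot-checks properties 3, 4, and 6 using two independent simulated dice:

```python
import random

random.seed(0)
N = 200_000

x = [random.randint(1, 6) for _ in range(N)]   # die 1
y = [random.randint(1, 6) for _ in range(N)]   # die 2, independent of die 1

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((vi - m) ** 2 for vi in v) / len(v)

a = 3.0
# Property 4: var(aX) = a^2 var(X)
print(var([a * xi for xi in x]), a**2 * var(x))
# Property 6: Var(X + Y) = Var(X) + Var(Y) when X and Y are independent
print(var([xi + yi for xi, yi in zip(x, y)]), var(x) + var(y))
```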

If an experiment has only 2 possible outcomes for each trial, we call it a Bernoulli random variable. Success: the probability of one outcome is p. Failure: the probability of the other is 1 - p. For n trials, P[X = i] is the probability of i successes in the n trials: P[X = i] = (n! / (i!(n - i)!)) p^i (1 - p)^(n - i). X is said to be a binomial random variable with parameters (n, p).

Roll a die 10 times. In this game, you win each time you roll a 6; anything else, you lose. What is P[X = 2], the probability that you win exactly twice?

Roll a die 10 times; a 6 means you win, anything else means you lose. P[X = 2], i.e. the probability you win twice:
P[X = 2] = (10! / (8! 2!)) (1/6)^2 (5/6)^8 = (90/2) (1/36) (5/6)^8 ≈ 0.29
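
The same calculation with Python's standard library (math.comb gives the binomial coefficient 10!/(8! 2!)):

```python
import math

n, i, p = 10, 2, 1/6
# P[X = i] = C(n, i) * p^i * (1 - p)^(n - i)
prob = math.comb(n, i) * p**i * (1 - p)**(n - i)
print(round(prob, 4))   # ~0.2907, i.e. about 0.29
```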

If p is small and n is large so that np is moderate, then an approximate (very good) probability is:
P[X = i] = e^(-λ) λ^i / i!,  where λ = np
p[X = i] is the probability that exactly i events happen.
With Poisson random variables, the mean is equal to the variance: E[X] = σ_X^2 = λ.

Let the probability that a letter on a page is misprinted be 1/1600, and assume 800 characters per page. Find the probability of 1 error on the page. Binomial random variable calculation:
P[X = 1] = (800! / 799!) (1/1600) (1599/1600)^799
Some of the above terms are very difficult to calculate directly.

Let the probability that a letter on a page is misprinted be 1/1600, and assume 800 characters per page. Find the probability of 1 error on the page using the Poisson approximation:
P[X = i] = e^(-λ) λ^i / i!. Here i = 1, p = 1/1600, and n = 800, so λ = np = 1/2.
So P[X = 1] = (1/2) e^(-0.5) ≈ 0.30. What is the probability there is more than one error per page? Hint: can you determine the probability that no errors exist on the page?
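
A sketch comparing the exact binomial value with the Poisson approximation, and answering the follow-up question via P[X > 1] = 1 - P[X = 0] - P[X = 1]:

```python
import math

n, p = 800, 1/1600
lam = n * p          # lambda = np = 0.5

def binom(i):
    return math.comb(n, i) * p**i * (1 - p)**(n - i)

def poisson(i):
    return math.exp(-lam) * lam**i / math.factorial(i)

print(binom(1), poisson(1))          # both ~0.303: exactly one error on the page
# Probability of more than one error per page:
print(1 - poisson(0) - poisson(1))   # ~0.090
```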

  +   - 

Examples of Poisson random variables:
1) Number of Supreme Court vacancies in a year
2) Number of dog biscuits sold in a store each day
3) Number of x-rays discharged off an anode

SNR = signal power / noise power.
If X represents power, SNR = E[X] / σ_X^2.
If X represents an amplitude or a voltage, then X^2 represents power, and SNR = E[X] / σ_X.
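
Connecting this to the earlier Poisson slides (a sketch, not from the original): for a photon count with mean λ, the variance also equals λ, so the amplitude-style SNR E[X]/σ_X reduces to sqrt(λ). The λ value below is arbitrary.

```python
import math

lam = 10_000                 # mean number of detected photons (illustrative value)
mean = lam                   # E[X] = lambda
sigma = math.sqrt(lam)       # sigma_X = sqrt(lambda), since variance = lambda for Poisson

print(mean / sigma)          # SNR = sqrt(lambda) = 100.0
```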

[Figure: an x-ray photon enters a scintillating screen of thickness d and interacts at depth x, producing light photons that spread to the film.] The high-density material stops x-ray photons through photoelectric absorption. The screen creates fluorescent light photons; these get captured or trapped by silver bromide particles on the film.

Analysis: First calculate the spray of light photons from an event at depth x.
h(r) = h(0) cos^3(θ) = h(0) x^3 / (x^2 + r^2)^(3/2)
Since h(0) = K/x^2 (K a constant; 1/x^2 is the inverse-square falloff),
h(r) = K x / (x^2 + r^2)^(3/2)

H_1(ρ) = F{h(r)} = 2π ∫_0^∞ h(r) J_0(2πρr) r dr,  where h(r) is given on the previous page.
H_1(ρ) = 2πK e^(-2πxρ)  (from a table of Hankel transforms)
H(ρ) = H_1(ρ) / H_1(0) = e^(-2πxρ)  (normalized to the DC value)
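
A numerical spot-check (a sketch; K, x, and the integration grid are arbitrary choices) that the Hankel transform of h(r) = Kx/(x^2 + r^2)^(3/2) is 2πK e^(-2πxρ):

```python
import numpy as np
from scipy.special import j0

K, x = 1.0, 0.1                      # arbitrary constant and event depth (same units as r)

# Fine radial grid; the integrand decays like 1/r^2, so truncating at r = 200 is adequate.
r = np.linspace(0.0, 200.0, 500_001)
dr = r[1] - r[0]
h = K * x / (x**2 + r**2) ** 1.5

for rho in (0.0, 1.0, 3.0):
    # H1(rho) = 2*pi * integral_0^inf h(r) J0(2*pi*rho*r) r dr   (simple Riemann sum)
    H1 = 2 * np.pi * np.sum(h * j0(2 * np.pi * rho * r) * r) * dr
    analytic = 2 * np.pi * K * np.exp(-2 * np.pi * x * rho)
    print(rho, H1, analytic)         # the two columns agree closely
```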

H(ρ) = H_1(ρ) / H_1(0) = e^(-2πxρ)  (normalized to the DC value)
Notice this depends on x, the depth of the event in the screen. Let's come up with an average response based on the likelihood of where events will occur in the scintillating material.
F(x) = 1 - e^(-μx) for an infinite screen: the probability that an x-ray photon will interact within a distance x. [Figure: F(x) versus x, rising from 0 toward 1]

p(x) = dF(x)/dx = μ e^(-μx)
For a screen of thickness d, F(x) = (1 - e^(-μx)) / (1 - e^(-μd)), since we are only concerned with captured photons. Then
H(ρ) = ∫_0^d H(ρ, x) p(x) dx = (1 / (1 - e^(-μd))) ∫_0^d e^(-2πρx) μ e^(-μx) dx
     = [μ / ((1 - e^(-μd)) (2πρ + μ))] [1 - e^(-(2πρ + μ)d)]
Typical values: d = 0.25 mm and μ = 15/cm for a calcium tungstate screen.
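
A sketch evaluating this averaged transfer function for the quoted typical values (d = 0.25 mm = 0.025 cm, μ = 15/cm); the frequencies chosen are arbitrary.

```python
import math

mu = 15.0        # 1/cm, calcium tungstate
d = 0.025        # cm (0.25 mm)

def H(rho_per_cm):
    """H(rho) = [mu/(1 - e^(-mu d))] * [1 - e^(-(2 pi rho + mu) d)] / (2 pi rho + mu)."""
    a = 2 * math.pi * rho_per_cm + mu
    return (mu / (1 - math.exp(-mu * d))) * (1 - math.exp(-a * d)) / a

for f_mm in (0, 1, 2, 5, 10):                              # spatial frequency in cycles/mm
    print(f_mm, "cycles/mm ->", round(H(f_mm * 10.0), 3))  # 10 cycles/cm per cycle/mm
```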

We would like to describe a figure of merit that gives a cutoff spatial frequency, akin to the bandwidth of a lowpass filter. For a typical screen with d ≈ 0.25 mm and μ = 15/cm (calcium tungstate), the bracketed term above can be approximated as 1 for spatial frequencies near the cutoff.

Figure of Merit
For moderate k (i.e. defining a cutoff frequency ρ_k where H(ρ_k) = k):
Let η = 1 - e^(-μd), the capture efficiency of the screen. Then
k ≈ μ / [η (2πρ_k + μ)],  so  2πkη ρ_k = (1 - kη) μ,  i.e.  ρ_k = (1 - kη) μ / (2πkη) ≈ μ / (2πkη) for kη << 1.
As the efficiency increases, ρ_k decreases; this is because η increases as d increases.
[Figure: H(ρ) versus ρ in cycles/mm, with the cutoff ρ_k marked at response level k]
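
Using the relation above (a sketch; the response level k = 0.1 is an arbitrary choice), the cutoff frequency for the typical single-screen values works out to a few cycles/mm:

```python
import math

mu = 15.0                               # 1/cm
d = 0.025                               # cm
eta = 1 - math.exp(-mu * d)             # capture efficiency, ~0.31

k = 0.1                                 # response level defining the cutoff (illustrative)
# From 2*pi*k*eta*rho_k = (1 - k*eta)*mu:
rho_k = mu * (1 - k * eta) / (2 * math.pi * k * eta)   # cycles/cm

print(round(eta, 3), round(rho_k / 10.0, 2), "cycles/mm")
```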

[Figure: a double-emulsion film sandwiched between two phosphor layers of thicknesses d_1 (above) and d_2 (below); an x-ray photon interacts at depth x.]
Intuitively, we would believe this system would work better. Let's analyze its performance. Let d_1 + d_2 = d so we can compare performance with the single screen.

H(ρ, x) = e^(-2πρ(d_1 - x)) for 0 < x < d_1
H(ρ, x) = e^(-2πρ(x - d_1)) for d_1 < x < d_1 + d_2
H(ρ) = (μ / (1 - e^(-μd))) { ∫_0^{d_1} e^(-2πρ(d_1 - x)) e^(-μx) dx + ∫_{d_1}^{d_1 + d_2} e^(-2πρ(x - d_1)) e^(-μx) dx }
     = (μ / (1 - e^(-μd))) [ (e^(-μd_1) - e^(-2πρd_1)) / (2πρ - μ) + (e^(-μd_1) - e^(-(2πρd_2 + μd))) / (2πρ + μ) ]
Again let's determine a cutoff frequency ρ_k for a lowpass filter that has a response H(ρ_k) = k. If we assume d_1 ≈ d_2 = d/2, then we can neglect e^(-2πρd), e^(-2πρd_1), and e^(-2πρd_2), because they will be small even for relatively small spatial frequencies; e^(-μd) is also small compared to e^(-μd_1). Since (2πρ)^2 >> μ^2 for all but the lowest frequencies,
H(ρ) ≈ 2μ e^(-μd_1) / (2πρ (1 - e^(-μd))),  giving  ρ_k ≈ 2 e^(-μd_1) μ / (2πkη).

Compare this cutoff frequency to the single-screen cutoff. Concentrate on the factor 2 e^(-μd_1), the new factor relative to the single screen-film system. Since e^(-μd_1) = e^(-μd/2) = sqrt(1 - η), the improvement is 2 sqrt(1 - η). With η ≈ 0.3, the improvement is about 1.7. Use the improvement to lower dose, quicken the exam, improve contrast, or some combination.
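
The arithmetic behind the quoted factor (a sketch): with e^(-μd) = 1 - η, the ratio of the double-screen cutoff to the single-screen cutoff is 2 e^(-μd/2) = 2 sqrt(1 - η).

```python
import math

eta = 0.3                                   # capture efficiency of the full thickness d
improvement = 2 * math.sqrt(1 - eta)        # 2 * e^(-mu*d/2), since e^(-mu*d) = 1 - eta
print(round(improvement, 2))                # ~1.67, i.e. about 1.7
```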

Assuming a circularly symmetric source,
I_d(x_d, y_d) = K t(x_d/M, y_d/M) ** (1/m^2) s(r_d/m) ** h(r_d)
where ** denotes two-dimensional convolution. The detector response is also circularly symmetric.

Spatial frequencies at the detector:
I_d(u, v) = K M^2 T(Mu, Mv) S(mρ) · H(ρ)
The object is what is of interest, though, so refer the spectrum to object-plane frequencies:
I_d(u/M, v/M) = K M^2 T(u, v) S((m/M)ρ) · H(ρ/M)
The object spectrum is multiplied by the product of two lowpass filters:
H_0(ρ) = S[(1 - z/d)ρ] H[(z/d)ρ]
(using m/M = 1 - z/d and 1/M = z/d from the system geometry).
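
A sketch of H_0(ρ) as a product of two lowpass filters. The Gaussian shapes for S and H below are made up purely for illustration; only the scaling arguments (1 - z/d) and (z/d) come from the slide.

```python
import numpy as np

def S(rho):                  # illustrative source (focal spot) frequency response
    return np.exp(-(rho / 5.0) ** 2)

def H(rho):                  # illustrative screen/detector frequency response
    return np.exp(-(rho / 8.0) ** 2)

def H0(rho, z, d):
    """Overall transfer function referred to the object plane."""
    return S((1 - z / d) * rho) * H((z / d) * rho)

d = 100.0                            # source-to-detector distance (arbitrary units)
rho = 4.0                            # spatial frequency (arbitrary units)
for z in (1.0, 50.0, 99.0):          # object near the source, midway, near the detector
    print(z, round(float(H0(rho, z, d)), 3))
# As z -> d, H0 -> S(0)*H(rho); as z -> 0, H0 -> S(rho)*H(0).
```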

As z → d:  H_0(ρ) → S(0) H(ρ)  (object at the detector: the screen/detector response dominates).
As z → 0:  H_0(ρ) → S(ρ) H(0)  (object at the source: the source response dominates).