Today’s Topic: Lec 3 — Prep for Labs 1 & 2. 3-D imaging: how to get a nice 2-D image when your samples are 3-D (deconvolution, with point scanning or with wide-field imaging). Getting distances out with FRET: donor quenching, sensitized acceptor emission, and orientation effects. You should have read the book chapter on microscopy AND either the Lab 1 or Lab 2 handout. Paul Selvin
Confocal (pinhole) microscopy (last time). Focused excitation light creates fluorescence, which reaches the detector; out-of-focus light mostly gets rejected (dark-field fluorescence). A way to improve the z-resolution by collecting less out-of-focus fluorescence, via a pinhole and scanning the excitation with focused light. Great especially for thick samples, but it takes time, needs complicated optics, and requires fluorophores that are very stable. (Why? Each point is excited many times as the beam scans, so photobleaching matters.) Confocal microscopy prevents out-of-focus blur from being detected by placing a pinhole aperture between the objective and the detector, through which only in-focus light rays can pass. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconintro.html
Deconvolution. What if you let light go in with no pinhole and don’t scan — i.e., use wide-field excitation? Can you mathematically discriminate against out-of-focus light? By knowing the (mathematical) transfer function of the microscope, can you do better? These are called deconvolution techniques. Common approach: take a series of images along the z-axis (a z-stack) and then unscramble them. For each focal plane in the specimen, a corresponding image plane is recorded by the detector and stored in a data-analysis computer; during deconvolution analysis, the entire series of optical sections is analyzed to create a three-dimensional montage. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconintro.html
Wide-field deconvolution imaging. There are many ways of doing this; you will use this one (AutoQuant, Nikon): http://www.meyerinst.com/imaging-software/autoquant/index.htm
How do you figure out what out-of-focus light gets through? Simplest way: make a thin 2-D sample, scan through it in z, and then back-calculate the point spread function (PSF).
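A minimal sketch of that back-calculation (the helper name and the background-subtraction step are illustrative assumptions, not any particular package’s method): each z-plane of a stack of a thin, sub-resolution sample is, up to background and normalization, a slice of the PSF.

```python
import numpy as np

def psf_from_thin_sample_stack(stack, background=0.0):
    """Estimate an empirical 3-D PSF from a z-stack of a thin,
    sub-resolution fluorescent sample: each recorded plane is a slice
    of the PSF.  Subtract the camera background and normalise so the
    whole stack sums to 1 (so convolving with it conserves intensity)."""
    psf = np.clip(stack.astype(float) - background, 0.0, None)
    return psf / psf.sum()
```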
Deconvolution: 2 Techniques — deblurring and image restoration. Deblurring algorithms are fundamentally two-dimensional: they subtract away blur estimated from the nearest neighbors in a 3-D stack. For example, the nearest-neighbor algorithm operates on plane z by blurring the neighboring planes (z + 1 and z − 1, using a digital blurring filter), then subtracting the blurred planes from the z plane. (+) Computationally simple. (−) Adds noise, reduces signal, and sometimes distorts the image. In contrast, image restoration algorithms are properly termed "three-dimensional" because they operate simultaneously on every pixel in a three-dimensional image stack: instead of subtracting blur, they attempt to reassign blurred light to its proper in-focus location. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconalgorithms.html
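A minimal sketch of the nearest-neighbor idea in Python (numpy only; the binomial blur kernel and the subtraction weight `c` are illustrative stand-ins for parameters a real package would derive from the measured PSF):

```python
import numpy as np

def _blur(plane, n_passes=2):
    """Simple digital blurring filter: repeated 3x3 binomial-style
    smoothing (a stand-in for the package's blur kernel)."""
    out = plane.astype(float)
    for _ in range(n_passes):
        p = np.pad(out, 1, mode="edge")
        out = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
               + 4.0 * p[1:-1, 1:-1]) / 8.0
    return out

def nearest_neighbor_deblur(stack, c=0.45):
    """Deblur plane z of a (z, y, x) stack by subtracting blurred copies
    of the neighboring planes z-1 and z+1.  `c` is an illustrative
    weighting; edge planes reuse their only neighbor."""
    out = np.empty(stack.shape, dtype=float)
    nz = stack.shape[0]
    for z in range(nz):
        below = _blur(stack[max(z - 1, 0)])
        above = _blur(stack[min(z + 1, nz - 1)])
        out[z] = stack[z] - c * (below + above)
    return np.clip(out, 0.0, None)  # negative intensities are unphysical
```

Note how the (−) points above show up directly: the subtraction throws away real signal from the z plane and the clipping discards information.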
Deconvolution: Image Restoration. Instead of subtracting blur (as deblurring methods do), image restoration algorithms attempt to reassign blurred light to its proper in-focus location. This is performed by reversing the convolution operation inherent in the imaging system: if the imaging system is modeled as a convolution of the object with the point spread function, then a deconvolution of the raw image should restore the object. However, you never know the PSF perfectly — it varies from point to point, especially as a function of z and of color — so you guess or take some average.
Blind Deconvolution — an image reconstruction technique in which both the object and the PSF are estimated. One family of iterative algorithms uses probabilistic error criteria borrowed from statistical theory: likelihood, a reverse variation of probability, is employed in maximum likelihood estimation (MLE). An initial estimate of the object is made and convolved with a theoretical point spread function calculated from the optical parameters of the imaging system. The resulting blurred estimate is compared with the raw image, a correction is computed, and this correction is used to generate a new estimate. Blind deconvolution alters the MLE procedure so that not only the object but also the point spread function is estimated: the same correction is also applied to the PSF, generating a new PSF estimate, and in further iterations the PSF estimate and the object estimate are updated together.
Different Deconvolution Algorithms. The original (raw) image is illustrated in Figure 3(a). The results of deblurring by a nearest-neighbor algorithm appear in Figure 3(b), with processing parameters set for 95 percent haze removal. The same image slice is illustrated after deconvolution by an inverse (Wiener) filter (Figure 3(c)), and by iterative blind deconvolution (Figure 3(d)), incorporating an adaptive point spread function method. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconalgorithms.html
Fluorescence Resonance Energy Transfer (FRET). Donor → Acceptor energy transfer: a dipole–dipole, distance-dependent interaction. The efficiency falls off steeply with the donor–acceptor distance R: E = 1 / (1 + (R/R0)^6), with R0 ≈ 50 Å for typical dye pairs. A spectroscopic ruler for measuring nm-scale distances and binding: watch the relative amounts of green (donor) and red (acceptor) emission over time.
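The standard efficiency–distance relation E = 1/(1 + (R/R0)^6), and its inverse for turning a measured E into a distance, as a quick sketch (hypothetical helper names):

```python
def fret_efficiency(R, R0=50.0):
    """FRET efficiency at donor-acceptor distance R (Angstroms):
    E = 1 / (1 + (R/R0)**6).  E = 0.5 exactly at R = R0."""
    return 1.0 / (1.0 + (R / R0) ** 6)

def distance_from_efficiency(E, R0=50.0):
    """Invert the relation above: R = R0 * (1/E - 1)**(1/6)."""
    return R0 * (1.0 / E - 1.0) ** (1.0 / 6.0)
```

The sixth power is what makes FRET a good ruler: E swings from nearly 1 to nearly 0 over roughly 0.5 R0 to 2 R0, so the method is most sensitive near R0.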
FRET works: first shown in 1967 (Stryer and Haugland, PNAS).
How to measure Energy Transfer: 1. donor intensity decrease; 2. donor lifetime decrease; 3. acceptor intensity increase. E.T. by changes in the donor: E = 1 − F_DA/F_D = 1 − τ_DA/τ_D, where F_DA and τ_DA are the donor’s intensity and excited-state lifetime in the presence of acceptor, and F_D and τ_D are the same but without the acceptor. Need to compare two samples: D-only and D-A. E.T. by increase in acceptor fluorescence: compare it to the residual donor emission. Need to compare one sample at two wavelengths, and also measure the dyes’ quantum yields.
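The donor-side routes (1 and 2) reduce to one expression, since intensities and lifetimes enter identically — a one-line sketch:

```python
def fret_from_donor(with_acceptor, donor_only):
    """E = 1 - F_DA/F_D, or equivalently E = 1 - tau_DA/tau_D: the same
    expression works for donor intensities (D-A vs. D-only samples) and
    for donor excited-state lifetimes."""
    return 1.0 - with_acceptor / donor_only
```

For example, a donor lifetime that drops from 4 ns (D-only) to 3 ns (D-A) gives E = 0.25; the intensity ratio of the same pair of samples should give the same answer.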
With a measurable E.T. signal: E.T. leads to a decrease in donor emission and an increase in acceptor emission. http://mekentosj.com/science/fret/
Example of FRET: Fluorescein (donor) and Rhodamine (acceptor).
Example of FRET via acceptor emission (excitation = 488 nm), from the “extracted acceptor emission.” The acceptor (Rhodamine) also absorbs some 488-nm light directly, so that contribution must be removed. You can determine how much direct fluorescence there is by shining a second wavelength at the acceptor where the donor doesn’t absorb (for the Fl–Rh pair, go to the red, about 550 nm), then multiplying this by the relative absorbance of Rh at 488 nm vs. 550 nm.
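A sketch of that direct-excitation correction (hypothetical helper; the numbers in the usage note are made up for illustration):

```python
def sensitized_emission(F_A_donor_exc, F_A_direct_exc, eps_ratio):
    """Acceptor emission due to FRET alone.
    F_A_donor_exc : acceptor emission while exciting the donor (488 nm
                    for the Fl-Rh pair);
    F_A_direct_exc: acceptor emission while exciting where the donor
                    does not absorb (~550 nm for Rh);
    eps_ratio     : acceptor absorbance at 488 nm relative to 550 nm.
    The scaled direct-excitation signal is subtracted off."""
    return F_A_donor_exc - eps_ratio * F_A_direct_exc
```

E.g., if the acceptor channel reads 100 counts under 488-nm excitation, 200 counts under 550-nm excitation, and Rh absorbs 5x less at 488 than at 550 (eps_ratio = 0.2), the FRET-sensitized part is 100 − 0.2 × 200 = 60 counts.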
Terms in R o. R o (in Angstroms) is given by R o = 0.211 [κ² n⁻⁴ q_D J]^(1/6), with J in M⁻¹ cm⁻¹ nm⁴. J is the normalized spectral overlap of the donor emission (f_D) and the acceptor absorption (ε_A): does the donor emit where the acceptor absorbs? q_D is the quantum efficiency (or quantum yield) for donor emission in the absence of acceptor (q_D = number of photons emitted divided by number of photons absorbed). n is the index of refraction (1.33 for water; 1.29 for many organic molecules). κ² is a geometric factor related to the relative orientation of the transition dipoles of the donor and acceptor and their relative orientation in space. Very important; often set = 2/3.
Terms in R o — J. Does the donor emit where the acceptor absorbs? J is the normalized spectral overlap of the donor emission (f_D) and the acceptor absorption (ε_A). For the spectral overlap between donor (CFP) emission and acceptor (YFP) absorption, R o ≈ 49–52 Å.
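Putting the terms together numerically, using the standard textbook relation R0 = 0.211 (κ² n⁻⁴ q_D J)^(1/6) Å with J in M⁻¹ cm⁻¹ nm⁴ (the J value in the test is illustrative, chosen to land near the CFP/YFP range quoted above):

```python
def forster_radius(J, qD, n=1.33, kappa2=2.0 / 3.0):
    """Forster radius R0 in Angstroms:
    R0 = 0.211 * (kappa2 * n**-4 * qD * J)**(1/6),
    with the overlap integral J in M^-1 cm^-1 nm^4.
    Defaults: water (n = 1.33) and randomized dipoles (kappa2 = 2/3)."""
    return 0.211 * (kappa2 * n ** -4 * qD * J) ** (1.0 / 6.0)
```

Because of the 1/6 power, R0 is very forgiving of errors in the inputs: doubling J changes R0 by only ~12%.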
Orientation Factor κ². κ² = (cos θ_DA − 3 cos θ_D cos θ_A)², where θ_DA is the angle between the donor and acceptor transition dipole moments, and θ_D (θ_A) is the angle between the donor (acceptor) transition dipole moment and the vector R joining the two dyes. κ² describes the spatial relationship between the DONOR emission dipole moment and the ACCEPTOR absorption dipole moment. It ranges from 0 (if all angles are 90°) to 4 (if all angles are 0°), and equals 2/3 if the donor and acceptor rotate rapidly and completely during the donor excited-state lifetime. κ² is usually not known and is assumed to have the value 2/3 (randomized distribution); this assumes the D and A probes exhibit a high degree of rotational motion. You can measure whether the donor and acceptor orientations randomize by looking at the polarization of their emission.
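The κ² expression, checked at the limiting geometries quoted above (standard-library math only; angles in radians):

```python
import math

def kappa_squared(theta_DA, theta_D, theta_A):
    """Orientation factor kappa^2 = (cos th_DA - 3 cos th_D cos th_A)**2.
    th_DA: angle between the two transition dipoles; th_D, th_A: angle
    between each dipole and the donor-acceptor vector R."""
    return (math.cos(theta_DA)
            - 3.0 * math.cos(theta_D) * math.cos(theta_A)) ** 2
```

All angles 90° gives κ² = 0 (no transfer, whatever the distance); collinear head-to-tail dipoles (all angles 0°) give the maximum, κ² = 4; parallel dipoles both perpendicular to R give κ² = 1.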