
1 Wavefront Sensing I Richard Lane Department of Electrical and Computer Engineering University of Canterbury Christchurch New Zealand

2 Location

3 Astronomical Imaging Group, past and present: Dr Richard Lane, Professor Peter Gough, Associate Professor P. J. Bones, Associate Professor Peter Cottrell, Professor Richard Bates, Dr Bonnie Law, Dr Roy Irwan, Dr Rachel Johnston, Dr Marcos van Dam, Dr Valerie Leung, Richard Clare, Yong Chew, Judy Mohr

4 Contents Session 1 – Principles Session 2 – Performance Session 3 – Wavefront Reconstruction for 3D

5 Principles of wavefront sensing
– Introduction
– Closed versus open loop wavefront sensing
– Nonlinear wavefront sensing
– Shack-Hartmann
– Curvature
– Geometric
– Conclusions

6 Imaging a star

7 The effect of turbulence

8 Adaptive Optics system (Diagram labels: distorted incoming wavefront, telescope, deformable mirror, wavefront sensor, image plane.)

9 Closed loop system
– Reduces the effects of disturbances, such as telescope vibration and modelling errors, by the loop gain
– Does not inherently improve the noise performance unless the closed loop measurements are easier to make
– Design is limited by stability constraints

10 Postprocessing system (feedforward compensation) (Diagram labels: distorted incoming wavefront, telescope, fixed mirror, wavefront sensor, detector plane, computer, image.)

11 Open loop system (SPID)
– Sensitive to modelling errors
– No stability issues with computer post-processing
– The problem is not noise, but errors in modelling the system
(Diagram: time axis; temporal coherence of the atmosphere, T.)

12 Modelling the problem (step 1) The relationship between the measured data, the object and the point spread function (psf) is linear: data = object ∗ psf + noise, where ∗ denotes convolution. A linear relationship means that if we multiply the input by α, we multiply the output by α; the output does not change form.
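To make the linearity concrete, here is a minimal numerical sketch of the model data = object ∗ psf + noise; the object, the Gaussian stand-in for the psf, and all sizes are illustrative, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

obj = np.zeros(64)
obj[20], obj[40] = 1.0, 0.5                 # two point sources: the "object"
psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
psf /= psf.sum()                            # Gaussian stand-in for the psf

data = np.convolve(obj, psf, mode="same") + 0.01 * rng.standard_normal(64)

# Linearity: scaling the object by alpha scales the noise-free data by alpha;
# the output does not change form.
alpha = 3.0
assert np.allclose(np.convolve(alpha * obj, psf, mode="same"),
                   alpha * np.convolve(obj, psf, mode="same"))
```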

13 Modelling the problem (step 2) The relationship between the phase and the psf is nonlinear: the psf is the squared magnitude of the Fourier transform of the pupil function with the phase applied (equivalently, the optical transfer function is the autocorrelation of the pupil function).
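A small sketch of why this step is nonlinear, assuming the standard Fraunhofer model psf = |F{pupil · e^{iφ}}|²; the grid size and the tilt aberration are arbitrary choices:

```python
import numpy as np

n = 128
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= 1.0).astype(float)   # circular aperture

def psf_from_phase(phase):
    """psf = |FFT(pupil * exp(i*phase))|^2 -- nonlinear in the phase."""
    field = pupil * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2

tilt = 2.0 * X * pupil                       # a simple tilt aberration
p1, p2 = psf_from_phase(tilt), psf_from_phase(2 * tilt)
# Doubling the phase does not double the psf: it moves the spot instead.
print(np.allclose(p2, 2 * p1))               # False
```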

14 (Figure panels: correct MAP estimate versus wrapped ambiguity; ML estimation, MAP estimation, phase retrieval.) Nonlinearity is caused by 2π wrapping interacting with smoothing.

15 Role of typical wavefront sensor To produce a linear relationship between the measurements and the phase:
– Speeds up reconstruction
– Guarantees a solution
– Degrades the ultimate performance
The phase is expressed as a weighted sum of basis functions: φ(x) = Σᵢ aᵢ θᵢ(x).

16 Solution is by linear equations The measurement vector s is related to the vector of basis-function coefficients a by the interaction matrix Θ: s = Θa. The i-th column of Θ corresponds to the measurement that would occur if the phase were the i-th basis function. Three main issues:
– What has been lost in linearising?
– How well can you solve the system of equations?
– Are they the right equations?
(A least-squares sketch follows below.)
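A minimal least-squares sketch of this linear reconstruction; the random matrix Θ and the sizes are synthetic stand-ins for a real sensor's interaction matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
m, k = 40, 10                              # measurements, basis functions

Theta = rng.standard_normal((m, k))        # interaction matrix: column i is the
                                           # measurement for the i-th basis function
a_true = rng.standard_normal(k)            # true coefficients
s = Theta @ a_true + 0.05 * rng.standard_normal(m)   # noisy measurement vector

a_hat, *_ = np.linalg.lstsq(Theta, s, rcond=None)    # solve s = Theta a
print(np.round(a_hat - a_true, 2))                   # small residual errors
```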

17 The effect of turbulence There is a linear relationship between the mean slope of the phase in a direction and the displacement of the image in that direction.

18 Trivial example There is a linear relationship between the mean slope and the displacement of the centroid:
– Measurements are the centroids of the data
– Interaction matrix is a scaled identity
– Reconstruct the coefficients of the tip and tilt
(A centroid sketch follows below.)
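A toy centroid computation for this trivial example; the Gaussian spot and the shift values are hypothetical:

```python
import numpy as np

def centroid(img):
    """Intensity-weighted first moment: the spot position in pixels."""
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

n = 32
ys, xs = np.indices((n, n))
true_shift = (2.5, -1.5)                   # displacement ∝ mean slope (tip/tilt)
img = np.exp(-0.5 * (((xs - n / 2 - true_shift[0]) ** 2
                      + (ys - n / 2 - true_shift[1]) ** 2) / 2.0 ** 2))

cx, cy = centroid(img)
print(cx - n / 2, cy - n / 2)              # recovers roughly (2.5, -1.5)
```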

19 Quality of the reconstruction The centroid is proportional to the mean slope (Primot et al., Welsh et al.). The best Strehl requires estimating the least mean square (LMS) phase (Glindemann). To distinguish the mean and LMS slopes you need to estimate the coma and higher-order terms. (Figure: mean slope, LMS slope, phase.)

20 Coma distortion (Figure: ideal image and detected image under coma.) The peak value is better than the centroid for optimising the Strehl, but is impractical for low-light data. The displacement reflects the difference between the LMS and mean tilt.

21 Where to from here? The real problem is how to estimate higher aberration orders. Wavefront sensors can be divided into:
– Pupil plane techniques, which measure slopes (curvatures) in the subdivided pupil plane: Shack-Hartmann, curvature (Roddier), pyramid (Ragazzoni), lateral shearing interferometers
– Image plane techniques, which go directly from data in the image plane to the phase (nonlinear): phase diversity (Paxman), phase retrieval

22 Geometric wavefront sensing Pyramid, Shack-Hartmann and curvature sensors are all essentially geometric wavefront sensors:
– They rely on the fact that light propagates perpendicularly to the wavefront
– There is a linear relationship between the displacement and the slope
– They are essentially achromatic

23 Geometric optics model A slope in the wave-front causes an incoming photon to be displaced by Δx = zWₓ, where z is the propagation distance and Wₓ the wave-front slope. The model is independent of wavelength and spatial coherence. (Figure: wavefront W(x), displacement Δx after propagating a distance z.)

24 Generalized wave-front sensor This is the basis of the two most common wave-front sensors. (Diagram labels: aberration, converging lens, focal plane; curvature sensor and Shack-Hartmann geometries.)

25 Trade-off For a fixed photon count, you trade off the number of modes you can estimate in the phase screen against the accuracy with which you can estimate them:
– To estimate a high number of modes you need good resolution in the pupil plane
– To make the estimate accurately you need good resolution in the image plane

26 Properties of a wave-front sensor
– Linearization: want a linear relationship between the wave-front and the measurements
– Localization: the measurements must relate to a region of the aperture
– Broadband: the sensor should operate over a wide range of wavelengths → geometric optics regime

27 Explicit division of the pupil (Figure: direct image and Shack-Hartmann subdivided image.)

28 Shack-Hartmann sensor Subdivide the aperture and converge each subdivision to a different point on the focal plane. A wave-front slope, Wₓ, causes a displacement of each image by zWₓ. (A simulation sketch follows below.)
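A minimal Shack-Hartmann sketch under the geometric model: each subaperture spot is displaced by z times the local mean slope. The wavefront, grid sizes and z are invented for illustration:

```python
import numpy as np

z = 1.0                                    # effective propagation distance
n_sub, n_pix = 4, 16                       # subapertures per side, samples each

x = np.linspace(-1, 1, n_sub * n_pix)
X, Y = np.meshgrid(x, x)
W = 0.3 * X**2 - 0.2 * X * Y               # hypothetical smooth wavefront

Wx = np.gradient(W, x, axis=1)             # wavefront slopes
Wy = np.gradient(W, x, axis=0)

spots = np.zeros((n_sub, n_sub, 2))
for i in range(n_sub):
    for j in range(n_sub):
        sl = np.s_[i * n_pix:(i + 1) * n_pix, j * n_pix:(j + 1) * n_pix]
        # Spot displacement = z * mean slope over the subaperture.
        spots[i, j] = z * Wx[sl].mean(), z * Wy[sl].mean()

print(spots[..., 0])                       # x-displacements, one per subaperture
```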

29 Fundamental problem Resolution in the pupil plane is inversely proportional to resolution in the image plane: you can have good resolution in one but not both (uncertainty principle). (Figure: pupil of diameter D; image of width w.)

30 Loss of information due to subdivision
– Cannot measure the average phase difference between the apertures
– Can only determine the mean phase slope within an aperture
– As the apertures become smaller, the light per aperture drops
– As the aperture size drops below r₀ (the Fried parameter), the spot centroid becomes harder to measure

31 Subdivided aperture

32 Implicit subdivision If you don't image in the focal plane, then the image looks like a blurred version of the aperture. If it looks like the aperture, then you can localise in the aperture.

33 Explanation of the underlying principle If there is a deviation from the average curvature in the wavefront, then the image will be brighter on one side of focus than on the other. If there is no curvature from the atmosphere, then it is equally bright on both sides of focus.

34 Slope based analysis of the curvature sensor The displacement of light from one pixel to its neighbour is determined by the slope of the wavefront.

35 Slope based analysis of the curvature sensor The signal is the difference between two slope signals → curvature.

36 Phase information localisation in the curvature sensor Diffraction blurring + geometric expansion

37 Curvature sensing
– Localization comes from the short effective propagation distance
– There is a linear relationship between the curvature in the aperture and the normalized intensity difference, (I₁ − I₂)/(I₁ + I₂)
– Broadband light helps reduce diffraction effects
(A toy numerical sketch follows below.)
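A toy sketch of the curvature signal under the linear geometric model I(±z) ≈ I₀(1 ∓ z∇²W); the defocus wavefront, z and the signs are illustrative assumptions:

```python
import numpy as np

n = 64
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
W = 0.1 * (X**2 + Y**2)                    # defocus: constant curvature

dx = x[1] - x[0]
lap_W = (np.gradient(np.gradient(W, dx, axis=0), dx, axis=0)
         + np.gradient(np.gradient(W, dx, axis=1), dx, axis=1))

z = 0.05                                   # effective propagation distance
I0 = np.ones((n, n))                       # uniform pupil illumination
I1 = I0 * (1 - z * lap_W)                  # intensity before focus
I2 = I0 * (1 + z * lap_W)                  # intensity after focus

signal = (I1 - I2) / (I1 + I2)             # equals -z * Laplacian(W) here
print(signal[n // 2, n // 2], -z * lap_W[n // 2, n // 2])
```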

38 Curvature sensing signal (Figure: simulated intensity measurement and curvature sensing estimate.) The intensity signal gives an approximate estimate of the curvature. Two planes help remove scintillation effects.

39 Irradiance transport equation A linear approximation gives a normalized intensity difference proportional to the wavefront curvature (see the equations below).
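A reconstruction assuming the standard (Teague) form of the irradiance transport equation, linearized for near-uniform pupil illumination; the planes are taken at ±z:

```latex
% Irradiance transport equation and its linearization for near-uniform
% illumination (standard form, stated here as an assumption).
\[
  \frac{\partial I}{\partial z} = -\nabla\!\cdot\!\left(I \nabla W\right)
                                = -\left(\nabla I \cdot \nabla W + I\,\nabla^2 W\right)
\]
\[
  I(\mathbf{x}, z) \approx I(\mathbf{x}, 0)\left[1 - z\,\nabla^2 W(\mathbf{x})\right]
  \quad\Longrightarrow\quad
  \frac{I_1 - I_2}{I_1 + I_2} \approx -\,z\,\nabla^2 W
\]
```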

40 Solution inside the boundary There is a linear relationship between the signal and the curvature. The sensor is more sensitive for large effective propagation distances.

41 Solution at the boundary (mean slope) If the intensity is constant at the aperture, the boundary signal is proportional to the mean slope at the edge (H(z) = Heaviside step function). (Figure panels: I₁, I₂, I₁ − I₂.)

42 The wavefront also changes As the wave propagates, the wave-front itself changes according to the wave-front transport equation. As the measurement plane approaches the focal plane, the distortion of the wavefront becomes more important and needs to be incorporated (van Dam and Lane).

43 Non-linearity due to the wavefront changing As a consequence the intensity also changes! So, to second order, the measured signal contains extra terms beyond the curvature: the sensor is non-linear!

44 Origin of terms
– One term is due to the difference in the curvature in the x- and y-directions (astigmatism)
– The other is due to the local wave-front slope displacing the curvature measurement

45 Consequences of the analysis As z increases, the curvature sensor is limited by the nonlinear terms K and T. A third-order diffraction term limits the spatial resolution to the order of √(λz).

46 Analysis of the curvature sensor As the propagation distance z increases:
– Sensitivity increases
– Spatial resolution decreases
– The relationship between the signal and the curvature becomes non-linear

47 Tradeoff in the curvature sensor Fundamental conflict between:
– Sensitivity, which dictates moving the detection planes toward the focal plane
– Aperture resolution, which dictates that the planes should be closer to the aperture

48 Geometric optics model Slopes in the wave-front cause the intensity distribution to be stretched like a rubber sheet; wavefront sensing maps the distribution back to uniform. (Figure: wavefront W(x), propagation distance z, displacement Δx.)

49 Intensity distribution as a PDF The intensity can be viewed as a probability density function (PDF) for photon arrival. As the wave propagates, the PDF evolves, and the cumulative distribution function (CDF) also changes.

50 Take two propagated images of the aperture; D = 1 m, r₀ = 0.1 m and λ = 589 nm. (Figure: intensity at −z and intensity at +z.)

51 Using the irradiance transport equation and the wave-front transport equation, one can prove that points of equal cumulative intensity in the two measurement planes are images of the same ray. This relationship is exact for geometric optics, even when there is scintillation. It can be thought of as the light intensity being a rubber sheet stretched unevenly.

52 Use the cumulative distribution function to match points in the two intensity distributions: find x₁ and x₂ with CDF₁(x₁) = CDF₂(x₂). The slope is then given by W′ ≈ (x₂ − x₁)/(2z). (A sketch follows below.)
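A 1-D sketch of this CDF-matching reconstruction under the geometric model; the slope function, photon count and plane separation are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n, z = 256, 0.1                            # histogram bins, distance to each plane

def Wp(u):
    """Hypothetical wavefront slope W'(u)."""
    return 0.05 * np.sin(np.pi * u)

# Geometric model: a photon at u lands at u -/+ z*W'(u) in the two planes.
photons = rng.uniform(-1, 1, 200_000)      # uniform pupil illumination
edges = np.linspace(-1.01, 1.01, n + 1)
I1, _ = np.histogram(photons - z * Wp(photons), bins=edges)
I2, _ = np.histogram(photons + z * Wp(photons), bins=edges)

# Match quantiles: x1 and x2 with CDF1(x1) = CDF2(x2) image the same rays.
c1 = np.cumsum(I1) / I1.sum()
c2 = np.cumsum(I2) / I2.sum()
centers = 0.5 * (edges[:-1] + edges[1:])
q = np.linspace(0.02, 0.98, 97)
x1 = np.interp(q, c1, centers)
x2 = np.interp(q, c2, centers)

slope_est = (x2 - x1) / (2 * z)            # recovered W' at the midpoints
mid = 0.5 * (x1 + x2)
print(np.max(np.abs(slope_est - Wp(mid)))) # small reconstruction error
```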

53 Results in one dimension Actual (black) and reconstructed (red) derivative

54 Simulation results

55 Comparison with Shack-Hartmann

56 Conclusions
– Fundamentally, geometric wavefront sensors are all based on the same linear relationship between slope and displaced light
– All sensors trade off the number of modes you can estimate against the quality of the estimate
– The main difference between the curvature and Shack-Hartmann sensors is how they divide the aperture
– The question is how to make this tradeoff optimally

