Wavefront Sensing I
Richard Lane, Department of Electrical and Computer Engineering, University of Canterbury, Christchurch, New Zealand
Astronomical Imaging Group, past and present: Dr Richard Lane, Professor Peter Gough, Associate Professor P. J. Bones, Associate Professor Peter Cottrell, Professor Richard Bates, Dr Bonnie Law, Dr Roy Irwan, Dr Rachel Johnston, Dr Marcos van Dam, Dr Valerie Leung, Richard Clare, Yong Chew, Judy Mohr
Contents • Session 1 – Principles • Session 2 – Performance • Session 3 – Wavefront Reconstruction for 3D
Principles of wavefront sensing • Introduction • Closed versus open loop wavefront sensing • Nonlinear wavefront sensing • Shack-Hartmann • Curvature • Geometric • Conclusions
Adaptive Optics system [Diagram: distorted incoming wavefront, telescope, deformable mirror, wavefront sensor, image plane.]
Closed loop system • Reduces the effects of disturbances, such as telescope vibration and modelling errors, by the loop gain • Design is limited by stability constraints • Does not inherently improve the noise performance unless the closed-loop measurements are easier to make (a toy loop is sketched below)
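A minimal toy loop in Python (not from the talk; the gain and disturbance values are illustrative) showing how an integrator controller attenuates a slow disturbance by an amount set by the loop gain:

```python
import numpy as np

# Toy AO loop: integrator controller m[k+1] = m[k] + g*r[k], where the
# wavefront sensor measures the residual r[k] = d[k] - m[k].
g = 0.5                                          # illustrative loop gain
d = np.sin(2 * np.pi * 0.01 * np.arange(2000))   # slow disturbance, e.g. vibration

m = 0.0
residuals = []
for dk in d:
    r = dk - m        # residual error seen by the wavefront sensor
    m += g * r        # deformable-mirror command update
    residuals.append(r)

# The closed loop attenuates the disturbance; a higher gain gives more
# rejection, at the cost of stability margin.
print(np.std(d), np.std(residuals))
```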
Postprocessing system (feedforward compensation) [Diagram: distorted incoming wavefront, telescope, fixed mirror, detector plane, wavefront sensor, computer, image.]
Open loop system (SPID) • Sensitive to modelling errors • No stability issues with computer post-processing • The problem is not noise but errors in modelling the system [Plot: temporal coherence of the atmosphere as a function of time T.]
Modelling the problem (step 1) • The relationship between the measured data, the object and the point spread function (psf) is linear: data = object ∗ psf + noise, i.e. d(x) = o(x) ∗ h(x) + n(x) • A linear relationship means that if we multiply the input by α, the output is multiplied by α; the output does not change form (see the sketch below)
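A minimal sketch of this linear forward model in NumPy/SciPy (the object, psf and noise level are all illustrative):

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)

obj = np.zeros((64, 64))
obj[32, 32] = 1.0                              # toy object: a single point source
psf = np.outer(np.hanning(9), np.hanning(9))   # toy psf: any normalized kernel
psf /= psf.sum()

# data = object convolved with the psf, plus noise
data = fftconvolve(obj, psf, mode="same") + 0.01 * rng.standard_normal(obj.shape)

# Linearity: scaling the object by alpha scales the noiseless data by alpha.
alpha = 3.0
noiseless = fftconvolve(obj, psf, mode="same")
assert np.allclose(fftconvolve(alpha * obj, psf, mode="same"), alpha * noiseless)
```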
Modelling the problem (step 2) • The relationship between the phase and the psf is nonlinear: the psf is the squared magnitude of the Fourier transform of the pupil function, h(x) = |F{A(u) e^{iφ(u)}}|², where A is the pupil amplitude and φ the phase; equivalently, the psf's Fourier transform is the autocorrelation of the pupil function (see the sketch below)
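A minimal sketch of the nonlinearity (illustrative circular pupil and tilt phase): doubling the phase does not double the psf:

```python
import numpy as np

def psf_from_phase(phase, pupil):
    # psf = |F{pupil * exp(i*phase)}|^2 -- quadratic in the field, nonlinear in phase.
    field = pupil * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2

n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (x**2 + y**2 <= 1.0).astype(float)   # circular aperture
tilt = 5.0 * x                               # a simple tilt aberration (radians)

# Doubling the phase shifts the psf rather than scaling it: the map is nonlinear.
p1 = psf_from_phase(tilt, pupil)
p2 = psf_from_phase(2 * tilt, pupil)
print(np.allclose(p2, 2 * p1))               # False
```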
Phase retrieval • Nonlinearity caused by 2π wrapping interacting with smoothing [Figure: wrapped ambiguity versus the correct MAP estimate; ML and MAP estimation compared.]
Role of a typical wavefront sensor • To produce a linear relationship between the measurements and the phase • Speeds up reconstruction • Guarantees a solution • Degrades the ultimate performance • The phase is expanded as a weighted sum of basis functions: φ(x) = Σᵢ aᵢ ψᵢ(x)
Solution is by linear equations • The measurement vector m is related to the basis-function coefficients a through the interaction matrix Θ: m = Θa • The ith column of Θ corresponds to the measurement that would occur if the phase were the ith basis function • Three main issues: • What has been lost in linearising? • How well can you solve the system of equations? • Are they the right equations? (a least-squares sketch follows)
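A minimal least-squares sketch (synthetic interaction matrix, coefficients and noise; all sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

n_meas, n_modes = 40, 10
Theta = rng.standard_normal((n_meas, n_modes))   # interaction matrix: column i is
                                                 # the measurement for basis function i
a_true = rng.standard_normal(n_modes)            # true basis-function coefficients
m = Theta @ a_true + 0.01 * rng.standard_normal(n_meas)   # noisy measurement vector

# Least-squares reconstruction of the coefficients from the measurements.
a_hat, *_ = np.linalg.lstsq(Theta, m, rcond=None)
print(np.max(np.abs(a_hat - a_true)))            # small reconstruction error
```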
The effect of turbulence There is a linear relationship between the mean slope of the phase in a direction and the displacement of the image in that direction.
Trivial example • There is a linear relationship between the mean slope and the displacement of the centroid • Measurements are the centroids of the data • The interaction matrix is the scaled identity • Reconstruct the coefficients of tip and tilt (a centroid sketch follows)
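A minimal centroid sketch (toy image; the geometric scale factor from centroid displacement to mean slope is omitted):

```python
import numpy as np

def centroid(img):
    # Intensity-weighted centre of the image: its displacement from the
    # detector centre is proportional to the mean wavefront slope.
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / img.sum(), (ys * img).sum() / img.sum()

img = np.zeros((32, 32))
img[10:13, 20:23] = 1.0     # a displaced spot
cx, cy = centroid(img)
print(cx, cy)               # (21.0, 11.0): the offset from (15.5, 15.5) gives tip/tilt
```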
Quality of the reconstruction • The centroid is proportional to the mean slope (Primot et al., Welsh et al.) • The best Strehl requires estimating the least mean square (LMS) phase (Glindemann) • To distinguish the mean and LMS slope you need to estimate the coma and higher-order terms [Figure: LMS slope, mean slope and phase.]
Difference between the LMS and mean tilt • The peak value is better than the centroid for optimising the Strehl • Impractical for low-light data [Images: ideal image, coma distortion, detected image.]
Where to from here • The real problem is how to estimate higher aberration orders • Wavefront sensors can be divided into: • pupil plane techniques, which measure slopes (or curvatures) in the divided pupil plane: • Shack-Hartmann • Curvature (Roddier), Pyramid (Ragazzoni) • Lateral shearing interferometers • image plane techniques, which go directly from data in the image plane to the phase (nonlinear): • Phase diversity (Paxman) • Phase retrieval
Geometric wavefront sensing • Pyramid, Shack-Hartmann and curvature sensors are all essentially geometric wavefront sensors • They rely on the fact that light propagates perpendicular to the wavefront • There is a linear relationship between the displacement and the slope • Essentially achromatic
Geometric optics model • A slope in the wavefront W(x) causes an incoming photon, after propagating a distance z, to be displaced by x′ = x + z ∂W/∂x • The model is independent of wavelength and spatial coherence [Diagram: wavefront W(x), propagation distance z, transverse coordinate x.]
Generalized wave-front sensor • This is the basis of the two most common wave-front sensors [Diagram: converging lens and aberration at the focal plane; Shack-Hartmann and curvature sensor configurations.]
Trade-off • For fixed photon count, you trade off the number of modes you can estimate in the phase screen against the accuracy with which you can estimate them • To estimate a high number of modes you need good resolution in the pupil plane • To make the estimate accurately you need good resolution in the image plane
Properties of a wave-front sensor • Linearization: want a linear relationship between the wave-front and the measurements • Localization: the measurements must relate to a region of the aperture • Broadband: the sensor should operate over a wide range of wavelengths (the geometric optics regime)
Explicit division of the pupil [Images: direct image versus Shack-Hartmann subdivided image.]
Shack-Hartmann sensor • Subdivide the aperture and converge each subdivision to a different point on the focal plane • A wave-front slope, Wx, causes a displacement of each image by zWx (a processing sketch follows)
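A minimal sketch of Shack-Hartmann processing (synthetic frame and illustrative geometry): split the detector into subapertures and take each spot's centroid shift from its subaperture centre:

```python
import numpy as np

def subaperture_slopes(frame, n_sub):
    # Split the frame into n_sub x n_sub cells; each centroid shift from the
    # cell centre is proportional to the local mean wavefront slope (z * Wx).
    cell = frame.shape[0] // n_sub
    slopes = np.zeros((n_sub, n_sub, 2))
    ys, xs = np.indices((cell, cell))
    for i in range(n_sub):
        for j in range(n_sub):
            sub = frame[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            if sub.sum() > 0:
                cx = (xs * sub).sum() / sub.sum() - (cell - 1) / 2
                cy = (ys * sub).sum() / sub.sum() - (cell - 1) / 2
                slopes[i, j] = (cx, cy)
    return slopes

frame = np.zeros((64, 64))
frame[8, 9] = 1.0           # one spot, displaced within the first subaperture
print(subaperture_slopes(frame, 4)[0, 0])   # nonzero shift -> nonzero local slope
```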
Fundamental problem • Resolution in the pupil plane is inversely proportional to resolution in the image plane • You can have good resolution in one but not both (uncertainty principle) [Diagram: pupil of diameter D, image of width w.]
Loss of information due to subdivision • Cannot measure the average phase difference between the apertures • Can only determine the mean phase slope within an aperture • As the apertures become smaller the light per aperture drops • As the aperture size drops below r0 (Fried parameter) the spot centroid becomes harder to measure
Implicit subdivision • If you don’t image in the focal plane then the image looks like a blurred version of the aperture • If it looks like the aperture then you can localise in the aperture
Explanation of the underlying principle • If there is a deviation from the average curvature in the wavefront, the image will be brighter on one side of focus than on the other • If there is no curvature from the atmosphere, it is equally bright on both sides of focus
Slope-based analysis of the curvature sensor • The displacement of light from one pixel to its neighbour is determined by the slope of the wavefront
Slope-based analysis of the curvature sensor • The signal is the difference between two slope signals, which gives the curvature
Phase information localisation in the curvature sensor • Diffraction blurring + geometric expansion
Curvature sensing • Localization comes from the short effective propagation distance • There is a linear relationship between the curvature in the aperture and the normalized intensity difference: (I1 − I2)/(I1 + I2) ∝ z∇²W inside the aperture • Broadband light helps reduce diffraction effects
Curvature sensing signal • The intensity signal gives an approximate estimate of the curvature • Using two planes helps remove scintillation effects (a simulation sketch follows) [Images: simulated intensity measurement; curvature sensing estimate.]
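A minimal simulation sketch (angular-spectrum propagation in NumPy; the wavefront amplitude, grid and distances are illustrative, not from the talk): propagate a smooth wavefront to ±z and check that the normalized intensity difference tracks z∇²W:

```python
import numpy as np

def propagate(field, wavelength, z, dx):
    # Angular-spectrum (Fresnel) propagation of a complex field over distance z.
    fx = np.fft.fftfreq(field.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

n, dx = 256, 1e-3                     # 256 x 256 grid with 1 mm pixels
wavelength, z = 589e-9, 50.0          # illustrative wavelength and distance
y, x = (np.indices((n, n)) - n / 2) * dx
W = 1e-7 * np.cos(2 * np.pi * 5 * x / (n * dx))   # smooth periodic wavefront (m)

field = np.exp(2j * np.pi * W / wavelength)
I1 = np.abs(propagate(field, wavelength, -z, dx)) ** 2   # intensity at -z
I2 = np.abs(propagate(field, wavelength, +z, dx)) ** 2   # intensity at +z
s = (I1 - I2) / (I1 + I2)                                # normalized difference

k = 2 * np.pi * 5 / (n * dx)
curvature = -k**2 * W                 # analytic Laplacian of the test wavefront
print(np.corrcoef(s.ravel(), (z * curvature).ravel())[0, 1])   # close to 1
```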
Irradiance transport equation • ∂I/∂z = −(∇I · ∇W + I∇²W) • For constant intensity I0 at the aperture, the linear approximation gives I(x, z) ≈ I0 (1 − z∇²W(x))
Solution inside the boundary • There is a linear relationship between the signal and the curvature. • The sensor is more sensitive for large effective propagation distances.
Solution at the boundary (mean slope) • If the intensity is constant at the aperture, I(x) = I0 H(R − |x|) with H the Heaviside function, then propagation displaces the aperture edge by z ∂W/∂n, so the difference signal at the boundary measures the mean (normal) slope [Plots: I1, I2 and I1 − I2.]
The wavefront also changes • As the wave propagates, the wavefront changes according to the paraxial eikonal equation: ∂W/∂z = −½|∇W|² • As the measurement approaches the focal plane, the distortion of the wavefront becomes more important and needs to be incorporated (van Dam and Lane)
Non-linearity due to the wavefront changing • As a consequence the intensity also changes! • So, to second order, the signal acquires additional terms (K and T, whose origins are described below) • The sensor is non-linear!
Origin of the terms • One term is due to the difference between the curvature in the x- and y-directions (astigmatism) • The other is due to the local wave-front slope displacing the curvature measurement
Consequences of the analysis • As z increases, the curvature sensor is limited by the nonlinearities K and T • A third-order diffraction term limits the spatial resolution to approximately √(λz), the Fresnel zone size
Analysis of the curvature sensor As the propagation distance, z, increases, • Sensitivity increases. • Spatial resolution decreases. • The relationship between the signal and the curvature becomes non-linear.
Tradeoff in the curvature sensor • Fundamental conflict between: • sensitivity, which dictates moving the detection planes toward the focal plane • aperture resolution, which dictates that the planes should be closer to the aperture
Geometric optics model • Slopes in the wave-front cause the intensity distribution to be stretched like a rubber sheet • Wavefront sensing maps the distribution back to uniform [Diagram: wavefront W(x), propagation distance z, transverse coordinate x.]
Intensity distribution as a PDF • The intensity can be viewed as a probability density function (PDF) for photon arrival • As the wave propagates, the PDF evolves • The cumulative distribution function (CDF) also changes (a 1-D sketch follows)
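A minimal 1-D sketch of this view (illustrative Gaussian intensity and displacement): matching the two CDFs recovers the photon displacement, which is the geometric slope signal:

```python
import numpy as np

# Toy 1-D example: the intensity in the second plane is the first plane's
# intensity displaced by z * dW/dx (here a constant shift for simplicity).
x = np.linspace(-1, 1, 2001)
shift = 0.1                              # geometric displacement z * slope
I1 = np.exp(-x**2 / 0.18)                # intensity at -z (illustrative)
I2 = np.exp(-(x - shift)**2 / 0.18)      # intensity at +z: same light, displaced

cdf1 = np.cumsum(I1); cdf1 /= cdf1[-1]   # cumulative distribution in plane 1
cdf2 = np.cumsum(I2); cdf2 /= cdf2[-1]   # cumulative distribution in plane 2

# For each point, find where plane 2's CDF reaches the same quantile.
x2 = np.interp(cdf1, cdf2, x)
print(np.median(x2 - x))                 # ~0.1: the displacement is recovered
```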
Take two propagated images of the aperture: D = 1 m, r0 = 0.1 m and λ = 589 nm [Images: intensity at −z; intensity at +z.]