Author Topic: Phase Retrieval?  (Read 2764 times)

Offline dmcclain

  • PixInsight Addict
  • ***
  • Posts: 117
Phase Retrieval?
« on: 2016 January 27 12:03:16 »
I have been trying to remove a focal plane tilt and resulting defocus that varies across the image. Using DynamicPSF and Deconvolution is extremely effective at isolating the point sources, but the results remind me of the guy who asked the genie for a million bucks, only to find a million deer with antlers in his front yard.

Back in the early '90s after Hubble was first launched, and they noticed the problems with the spherical aberration, we did a lot of something called phase retrieval. I can't tell you the details, other than it was an iterative method of reconstructing the wavefront from an intensity image, and then using Zernike polynomials, making corrections to remove the spherical aberration.

Is there anything already implemented in PI that would accomplish this sort of thing? I know that in the late 90's the ideas about wavelet decomposition and multiresolution support broke onto the scene. Perhaps there are wavelet formulations of the phase retrieval idea?

Offline dmcclain

  • PixInsight Addict
  • ***
  • Posts: 117
Re: Phase Retrieval?
« Reply #1 on: 2016 January 28 15:17:38 »
This paper,

http://arxiv.org/pdf/1304.7337.pdf

demonstrates the use of phase retrieval to map out spatially varying aberrations over an image plane in wide-field microscopy. They then show that the recovered aberration information can be used to deconvolve the images and produce an improved, aberration-free image. They use a calibration target. Seems to me, we have hundreds, and possibly thousands, of calibration targets in every frame.
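To make the second half of that pipeline concrete, here is a rough numpy sketch (not the paper's code; the tile size, the plain Richardson-Lucy scheme, and the psf_for_tile() lookup, which would be built from something like DynamicPSF star fits, are all my own illustrative assumptions):

Code:

import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(tile, psf, n_iter=30):
    """Plain Richardson-Lucy deconvolution of one tile with one local PSF."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1, ::-1]
    estimate = np.full(tile.shape, tile.mean())
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = tile / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate

def deconvolve_varying_psf(image, psf_for_tile, tile=256):
    """Deconvolve an image whose PSF varies across the field, tile by tile.
    psf_for_tile(y, x) is assumed to return the local PSF estimate for the
    tile whose upper-left corner is (y, x), e.g. from star fits in that tile.
    image is assumed to be a float array."""
    out = np.zeros_like(image)
    for y in range(0, image.shape[0], tile):
        for x in range(0, image.shape[1], tile):
            patch = image[y:y + tile, x:x + tile]
            out[y:y + tile, x:x + tile] = richardson_lucy(patch, psf_for_tile(y, x))
    return out

A real implementation would blend or overlap the tiles to avoid blocking artifacts at tile boundaries, but the principle is the same.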

When I left the military-industrial empire about 10 years ago, they were knocking themselves out trying to build micro-precision calibration tables and targets so that they could "accurately" measure the aberrations of their IR sensors in space-borne EKV platforms. I told them at the time they should be considering randomness their friend, but they persisted for at least another 15 years before concluding the same thing... just like we knock ourselves out over focus and sensor tilt and collimation.

Seems to me, we get the data that we get. And we ought to be able to extract much more from it, despite the presence of realistic and inevitable small amounts of sensor tilt, defocus, coma, and chromatic aberration. I don't think we have yet scratched the surface on what can be done.

DynamicPSF and Deconvolution are a start. By now, phase retrieval seems to have become a mainstay of X-ray crystallography, synthetic aperture radio astronomy, and electron microscopy. How has astronomical imaging been left behind?
 
« Last Edit: 2016 January 28 15:56:19 by dmcclain »

Offline msmythers

  • PTeam Member
  • PixInsight Jedi
  • *****
  • Posts: 1178
    • astrobin
Re: Phase Retrieval?
« Reply #2 on: 2016 January 28 16:16:13 »
PI has a Fourier Transform tool.


Mike

Offline dmcclain

  • PixInsight Addict
  • ***
  • Posts: 117
Re: Phase Retrieval?
« Reply #3 on: 2016 January 29 15:24:29 »
Well, that certainly is a long pole... but what the heck...

I tried doing the FFT of an image that looks like it could use a little help in the focus. Then, by trial and error, I arrived at a PixelMath expression that could be used to modify the phase mask, subtracted 1/5 wave of defocus (for now just centered in the aperture plane), and then inverse-FFT'd back to a reconstructed image.
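In numpy terms, the experiment amounts to something like this (a sketch only, not my PixelMath; the 1/5-wave amplitude and the normalized frequency-plane radius are just the values I happened to try):

Code:

import numpy as np

def remove_defocus(img, waves=0.2):
    """Subtract a defocus term from the phase of the image's Fourier transform."""
    F = np.fft.fftshift(np.fft.fft2(img))
    ny, nx = img.shape
    y, x = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
    rho = np.hypot(x / (nx / 2), y / (ny / 2))   # 0 at the centre, 1 at the edges
    z4 = 2.0 * rho**2 - 1.0                      # Zernike defocus shape
    corrected = F * np.exp(-1j * 2.0 * np.pi * waves * z4)
    img_corr = np.fft.ifft2(np.fft.ifftshift(corrected))
    return np.real(img_corr)   # negative pixels here are exactly why a positivity constraint is needed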

Believe it or not, when I run FWHMEccentricity on the lightness images, the focus (FWHM) really did improve in the center of the image by about 10-15%, and the eccentricity rose about 20%, as you would expect. Focus (FWHM) became more uniform across the image.

That improvement is small only because I was trying to keep the holes in the image to a minimum (see next paragraph). I think focus could be improved even more.

But there was clear evidence of the need for a positivity constraint, since holes were being punched in my image, and the image developed fixed-pattern noise until it started to look like an outdoor doormat. An amusing first cut at hand-cranked phase retrieval. But I think I need a side program to automate things. I'm not sure how I would implement a positivity constraint in PI, since it keeps images in the interval [0, 1]. And from what I know about phase retrieval, we need on the order of 160-1000 iterations.
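For reference, the textbook error-reduction loop (Gerchberg-Saxton / Fienup style) is where the positivity constraint would enter. A minimal sketch, assuming we keep a measured Fourier magnitude fixed and clamp negative pixels in the image domain on every pass; the iteration count and random starting phase are placeholders:

Code:

import numpy as np

def error_reduction(measured_mag, n_iter=500, seed=0):
    """Iterate between the measured Fourier magnitude and image-domain positivity."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, measured_mag.shape)
    estimate = np.real(np.fft.ifft2(measured_mag * np.exp(1j * phase)))
    for _ in range(n_iter):
        estimate = np.maximum(estimate, 0.0)          # positivity constraint
        F = np.fft.fft2(estimate)
        F = measured_mag * np.exp(1j * np.angle(F))   # impose the measured magnitude
        estimate = np.real(np.fft.ifft2(F))
    return np.maximum(estimate, 0.0)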

Offline dmcclain

  • PixInsight Addict
  • ***
  • Posts: 117
Re: Phase Retrieval?
« Reply #4 on: 2016 January 29 16:06:50 »
Another hour spent on trial and error with defocus amount vs. FWHM maps. It looks like the best is around 0.01 wave of defocus, for a 30% improvement in FWHM across the image.
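If I do write that side program, the sweep itself is trivial. A sketch, where correct() stands in for whatever trial correction is applied (e.g. the defocus subtraction sketched in my previous reply) and mean_fwhm() is a hypothetical stand-in for what FWHMEccentricity measures:

Code:

import numpy as np

def best_defocus(img, correct, mean_fwhm, amounts=np.linspace(0.0, 0.05, 26)):
    """Try a range of defocus amounts (in waves) and keep the one with the
    smallest mean FWHM. Both callables are supplied by the caller."""
    scores = [(mean_fwhm(correct(img, w)), w) for w in amounts]
    return min(scores)   # (best mean FWHM, defocus amount in waves)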

There are still holes in the image from non-positivity, but the image looks pretty decent otherwise. I think this shows a glimmer of hope for the method.

[ Actually... thinking back over those 10 years, I guess I already wrote a phase retrieval of sorts for the M-I Complex. It was called BlurFit, originally written in RSI/IDL, then later translated to OCaml and then to NML (my own math language patterned after OCaml). The NML version worked stupendously well for the next 10 years.

The idea was that we moved a "point" target (at 5 microns wavelength) on an X-Y stage by fractions of a pixel width. We would grab images of that target in the multiple pixel "phasings", and from those images derive a series of Zernike coefficients describing the optical train. I guess, upon reflection, this is phase retrieval. I should dig that code back out and see what can be done with star fields and the random placements of star images such as we have.

I wrote a paper at the urging of Phil Wadler, for the ACM. You can read all about it here: http://thirdworld.nl/blurfit ]