Author Topic: Holiday Gift: My PSF extraction method explained !!  (Read 5385 times)

Offline darkownt

  • PixInsight Enthusiast
  • **
  • Posts: 92
Holiday Gift: My PSF extraction method explained !!
« on: 2010 December 29 12:02:44 »
Hello All:

I have been meaning to do some sort of tutorial to share with everyone, but I have been unable to find the time to really do it justice.  The techniques here are best suited to PSFs resulting from DSLR optics and are best performed in a terrestrial setting, although I have every belief that they could be adapted to an astronomical setting if an appropriate target and true image source could be found (see below).

My PSF extraction techniques are used to extract the actual PSF of a real optical system for use in correcting a series of images taken at the same optical settings (zoom, f-stop, etc.).  Since it uses no simulations of optics I refer to it as a measurement/calibration tool: measure the spread under controlled conditions to profile your lens, then make a PSF for use in correcting further pictures taken with those optics.  The main limits to the accuracy are the algorithms themselves (particularly restoration) and digital noise.

Anyhow, I wrote to Juan some time ago, and he encouraged me to tell you all about it!  So here is the message I sent to Juan.

Enjoy playing ... if you have any questions or want any tips just ask me.  I'm sure those of you who like to make comments rather than ask questions will also chime in as usual.  ;) :P

Happy happy holidays!!!!


CHEERS!
COLIN

Hi Juan:

I wish I had time to present my PSF extraction techniques on the forum, but I have not had the time to put something together.

This may be of interest to you since it is done entirely with PI, and although in some ways it relies on workarounds, they could be improved with specific code/tools.

I have been using my method for measuring PSFs of my DSLR lenses, at specific zoom and aperture settings.  As such it really is a calibration - restoration tool.  Images from non-local sources can't be used (with a few exceptions described below).  Once the lens at specific zoom and aperture are profiled I use the PSF with the PI tools for deconvolution and restoration.  The extracted PSFs have allowed me to deconvolve better than any other method I have tried before.

The Method:

This is simple stuff for you so I will try to be brief (unsuccessfully!).

The method I use relies on the relationship between a PSF as a convolution kernel (K), a true image (T), and the blurry image (B) resulting from the spread of T.

B = K*T = T*K  (where * denotes convolution)

We usually have B, but K is unknown, so we do our best to find a guess Kg ≈ K; i.e. try B/Kg = Trecovered and see how good Trecovered looks.  (For lack of a symbol, A/C means A deconvolved by C.)

With a DSLR it's relatively easy to take pictures of something in a very controlled setting, which gives us the opportunity to photograph a subject for which we already have T, or for which T is quite easy to obtain.

Having T we can use B/T = K to find the PSF.
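To make this concrete, here is a minimal numerical sketch of the relationship (a Python/NumPy stand-in for the idea, not the PI workflow itself; the synthetic T, the small kernel K, and the regularization constant eps are all illustrative assumptions): build B = T*K from a known T and K, then recover K by a regularized Fourier division B/T.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stand-in "true" image T: a random black-and-white pattern
T = (rng.random((64, 64)) > 0.7).astype(float)

# A small asymmetric PSF K, normalized to unit sum
K = np.zeros((64, 64))
K[31:34, 31:33] = [[0.1, 0.2], [0.2, 0.3], [0.1, 0.1]]

# B = T * K (circular convolution via the Fourier domain)
B = np.real(np.fft.ifft2(np.fft.fft2(T) * np.fft.fft2(np.fft.ifftshift(K))))

# K = B / T: regularized (Wiener-style) Fourier division,
# where eps guards against division by near-zero frequencies
Tf = np.fft.fft2(T)
eps = 1e-6
K_recovered = np.fft.fftshift(np.real(np.fft.ifft2(
    np.fft.fft2(B) * np.conj(Tf) / (np.abs(Tf) ** 2 + eps))))
```

In this noiseless toy, K_recovered matches K almost exactly; with a real photo, noise and registration error set the limit.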


How I use PixInsight to do this

Step 1:  Generate or find T.   

a) Generate T:  I have found T is optimally a synthetic black-and-white image (no grey), since we then have clean white and black points to stretch images with, and all grey values in the actual photo will be due to optical spreading.  The image should contain features of multiple sizes, with boundaries running in all sorts of directions (no directional preference for edges or shapes).  Finally, the synthetic image should be framed with black, to leave room for the light from the white objects of the interior to spread outward.

Once T is generated in software, have a hardcopy created, printed in black and white at a high resolution.

b) Find T:  a suitable T such as that described above may be an existing 2D image or other 2D object which can be scanned or photographed.  The scanning process should be carried out on a black background if the image or object does not already have a black frame.

If the 2D image or 2D object is photographed, this should be done at a distance such that perspective effects can be corrected (some applications can take an image and correct for spherical bulging of perspective projection to provide an image which represents a 2D projection). 

Since spreading occurs in scanned and photographed images too, the last step in generating T this way is to resize (shrink) it to the point that no spreading is left in the final image T.  This leaves some grey pixels at the boundaries, but it should be OK.
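As a rough illustration, a synthetic T of the kind described in Step 1a (binary values only, features of many sizes, a black frame) could be generated like this; the image size, border width, and disc counts are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
size, border = 512, 48

# Start fully black; the outer frame stays black to leave room
# for light from the white interior features to spread outward
T = np.zeros((size, size))

yy, xx = np.mgrid[0:size, 0:size]
for _ in range(40):
    # White discs of many sizes: curved boundaries give edges
    # in all directions, so there is no preferred orientation
    r = rng.integers(4, 40)
    cy = rng.integers(border + r, size - border - r)
    cx = rng.integers(border + r, size - border - r)
    T[(yy - cy) ** 2 + (xx - cx) ** 2 <= r * r] = 1.0
```

The result has only pure black and white pixels, so every grey value in the photographed B is attributable to optical spreading.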


Step 2:  Take a picture of the printed T (or the 2D image or object) with the optical system to be measured.  Ensure that the resolution of this photo (B) is no greater than the resolution of the image T used (by resolution I mean the resolution of the target details).


Step 3:  Obtain a registered version of T using B as the reference

a) Convert the sample format of both T and B to 32-bit greyscale.

b) Crop and scale T down to match B as closely as possible (to retain the proper size of the PSF (K), we do not scale B); possibly rotate T manually to match as well.

c) Rescale the intensity of both B and T (for the widest range, and so they match).

d) Use ImageRegistration to align T to B.  To retain the orientation of the PSF (K), we do not rotate or scale B.  This step can be a bit tricky.  I have used faint-star settings, less peakedness, and no scaling.  Almost always, if T is close enough in size and perhaps orientation to B, I can find settings in ImageRegistration which align T to B.  OPPORTUNITY: a tool that works well with non-stellar images, i.e. one that can align any two images, would be useful.  Also, applying a blurring kernel or filter to T before determining the scale and rotation parameters may help, although the final image should be a registered version of the unfiltered T.
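For what it's worth, the non-stellar alignment mentioned in the OPPORTUNITY could be sketched with simple phase correlation (a Python/NumPy illustration of the general idea, not an existing PI tool; it only recovers integer translations, whereas a real tool would also need scale and rotation):

```python
import numpy as np

def phase_correlate(ref, img):
    """Estimate the integer (dy, dx) shift that, applied to img
    (e.g. with np.roll), aligns it to ref."""
    R = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    R /= np.abs(R) + 1e-12              # keep only the phase
    corr = np.real(np.fft.ifft2(R))
    h, w = corr.shape
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image to negative values
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

# Toy demo: a "T" that is a shifted copy of "B"
rng = np.random.default_rng(2)
B = rng.random((64, 64))
T_shifted = np.roll(B, (3, -5), axis=(0, 1))
shift = phase_correlate(B, T_shifted)   # shift that re-aligns T to B
```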


Step 4:  Deconvolve B with T to get K

This also can get tricky, and I will explain the process although I do not know why it occurs this way.

Using RestorationFilter on B (or a preview), set both dynamic range extension ranges to 1.  Set the filter to least constrained (although Wiener works), and the estimated error as low as possible (1×10^-15).  Use T as the external PSF.  If we try these values, a great mess of white often appears at the center.  If we set the amount to 0.01, however, the beginnings of a PSF emerge.  One will note, however, that the PSF (K) is not accurate, as it is overexposed.  OPPORTUNITY: a tool that assumes even less error than 1×10^-15, or is more accurate, or has greater dynamic range extension than 1.0 <- this may be the key.  I assume that restoration is intensity conserving, hence all of the intensity of the image would need to be shuffled to the small central point, which could require dynamic range extension in the thousands to millions (just a guess).
The fix for this is a workaround.  Use PixelMath to scale the original B by 0.001 or 0.0001 (some experimentation is required, depending on the size of B, how much white is in it, etc.).  Try RestorationFilter again; if the PSF is not saturated, try bumping Amount up from 0.01 (depending on how dark B was made in the previous step I have used Amount values as high as 50) to a value where it is just below intensity saturation (brightest pixel < 1).  Undo/redo the PixelMath scaling to find the right amount such that a non-saturated PSF is generated.  This fix has a catch-22: low Amounts leave more trace of B in the background, while lower scaling of B can run into rounding errors.  OPPORTUNITY: a tool with greater dynamic range extension and more precision.
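For anyone who wants to see the workaround in miniature, here is a Python/NumPy sketch.  The restore() function is my assumption of roughly what RestorationFilter does with an external PSF (a regularized Fourier division, with eps standing in for the estimated-error parameter); PI's internals may well differ.  It pre-scales B, restores, and then raises the amount until the brightest pixel sits just below saturation:

```python
import numpy as np

def restore(B, T, eps=1e-15):
    """Regularized Fourier division B/T (assumed stand-in for
    RestorationFilter with T as the external PSF)."""
    Tf = np.fft.fft2(T)
    return np.fft.fftshift(np.real(np.fft.ifft2(
        np.fft.fft2(B) * np.conj(Tf) / (np.abs(Tf) ** 2 + eps))))

def restore_unsaturated(B, T, prescale=0.001, headroom=0.95):
    """The workaround: pre-scale B down (the PixelMath step),
    restore, then raise the amount until the brightest PSF pixel
    is just below intensity saturation (1.0)."""
    raw = restore(B * prescale, T)
    amount = headroom / raw.max()
    return raw * amount

# Demo with a synthetic T, a simple box PSF K, and B = T * K
rng = np.random.default_rng(0)
T = (rng.random((64, 64)) > 0.7).astype(float)
K = np.zeros((64, 64))
K[30:33, 30:33] = 1.0 / 9.0
B = np.real(np.fft.ifft2(np.fft.fft2(T) * np.fft.fft2(np.fft.ifftshift(K))))

psf = restore_unsaturated(B, T)         # same shape as K, peak just below 1
```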

Crop the resulting PSF.

Step 5:  Clean up the PSF

For any number of reasons, there is usually noise in the generated PSF, which can be seen in the background.  I have not been able to figure out a way to deal with noise in the PSF itself (it's hard to know what is noise and what is not), but I use a threshold-type PixelMath expression to blacken values below a certain level, so that the noise outside the PSF is removed.
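The threshold cleanup itself is just conditional blackening.  In Python it might look like the following (the PixelMath equivalent would be something along the lines of iif($T < 0.01, 0, $T); the threshold value here is an arbitrary assumption to be tuned per image):

```python
import numpy as np

def clean_psf(psf, threshold=0.01):
    """Blacken background values below the threshold,
    leaving the PSF core untouched."""
    out = psf.copy()
    out[out < threshold] = 0.0
    return out

# A toy PSF: a bright 3x3 core sitting on low-level background noise
rng = np.random.default_rng(3)
psf = rng.random((32, 32)) * 0.005      # noise well below the threshold
psf[15:18, 15:18] = [[0.1, 0.3, 0.1],
                     [0.3, 0.9, 0.3],
                     [0.1, 0.3, 0.1]]
cleaned = clean_psf(psf)
```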

If we could somehow improve the accuracy and/or range of the restoration perhaps there will be less noise (especially resulting from synthetic tests).

The PSF emerging from this process is often too broad.  I chalk it up to all the rescaling of T and the rounding errors of the process, although in synthetic tests without registration (synthetic B is simply T convolved with a small kernel) the PSF is still not perfect.  Narrow the PSF using deconvolution by an averaging kernel.  (I usually skip this step if I know the PSF is irregular, like the acronym "PI" I once used in a synthetic test. :)

OPPORTUNITY: a Tool that does all of this automatically from T and B!!

The PSFs I have extracted using this method are the best I have been able to generate or simulate for use in deconvolving or restoring real-world pictures taken with my DSLR.  I know the method has its limits, since my synthetic tests have always revealed a small difference between the actual PSF and the extracted PSF, even when B was generated in PJSR by convolving T with K; i.e. the extracted PSF is not exactly (although close to) K.

Exception:  I stated above that this works only when you can generate or find T "locally".  This would imply astronomical pictures are not useful for calibration; how then could you calibrate a telescope?  Star fields can be generated synthetically, and with correct brightnesses.  Generated as point sources in a synthetic T, perhaps this could be used with a properly photographed (linear intensity, no intensity saturation) star field.  If I owned a telescope I would be doing this sort of thing and perfecting the process… well, right now!  So for hobbyists/backyard astronomers who don’t have the undoubtedly far superior methods available to professional observatories, my method with a reference star field could be used to help them calibrate/measure their optical system.

Another idea I have had for finding a suitable T involves thresholding an already blurred black-and-white image.  If the radius of curvature of the white and black features in the image is much greater than the size of the PSF, setting a proper threshold level could perhaps generate a sharp black-and-white image usable as T.  Synthetic experimentation suggests this may be possible, although the subjects would still likely have to be terrestrial, like a speed limit sign or a billboard.

Please let me know if you have any thoughts or questions, would like to discuss, or would like me to show you examples.  I think there are opportunities here for a useful little tool!

PS:   I've also used a variant of this to measure an INK spread function (ISF??) of my inkjet printer… yes, ink droplets spread a little in paper… and believe it or not, oversharpening an image (something I never normally do) using this ISF (i.e. deconvolving before printing) results in a better printed image (mostly no visible oversharpening, yet sharper than what the printer would produce from a normal image)!!!

Cheers
Colin


Offline Simon Hicks

  • PixInsight Old Hand
  • ****
  • Posts: 333
Re: Holiday Gift: My PSF extraction method explained !!
« Reply #1 on: 2010 December 29 12:40:05 »
Hi Colin,

Happy Holidays to you too! This sounds like an extremely good tool if it can be created.

I'm not sure I've quite got my head around all the details here, but I have a few simple questions.

Firstly, I had assumed that an unsaturated star is effectively a point source, and therefore if I capture this star (unsaturated) through my optics then isn't that as good as the synthetic B&W image? If not, why not?

Secondly, in most (all?) lenses/scopes there are distortions across the field of view. So the stars in the corners of the image will not be the lovely tight circles they are in the centre. It seems to me that it would be great to have a tool to map the PSF as a function of position in the image. Then a deconvolution with this PSF(x,y) would correct the whole image optimally. This could maybe allow use of faster scopes or allow us to open up lenses to their full aperture.

I hope someone can help you get this scripted, it sounds great.

Cheers
         Simon

Offline darkownt

  • PixInsight Enthusiast
  • **
  • Posts: 92
Re: Holiday Gift: My PSF extraction method explained !!
« Reply #2 on: 2010 December 30 19:32:44 »
« Quote from: Simon Hicks on 2010 December 29 12:40:05 »

Hi Simon!

If you can get an isolated star centered perfectly on a pixel you would have a great PSF.  Two difficulties arise in reality:

1.  Intensity pollution: other stars/objects in the vicinity of the chosen point source, even very faint ones, will make false contributions to the PSF.  Even if NO other sources were in the field of view around your point source (probably not possible), spreading from sources outside the view can still contribute (erroneously) to the PSF.

2.  The binning problem: you need to ensure the source is centered on a pixel.  Imagine the PSF were a square function the size of a single pixel.  Measuring it properly would require the source to be centered (the ONLY alignment which would appear as a single pixel at full intensity); otherwise a two- or four-pixel PSF of lower intensity might be measured, which is clearly too wide and incorrect.  Taking multiple photos to get it centered should not, however, be a problem.

I think intensity pollution is the biggest problem.
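To put a number on the binning problem: treat the true PSF as a square exactly one pixel wide and area-sample it onto the detector grid at different subpixel offsets (a toy Python model, not a measurement):

```python
import numpy as np

def sample_point(dx, dy):
    """Flux from a source one pixel wide, offset by subpixels
    (dx, dy) in [0, 1), area-sampled onto a 2x2 block of pixels."""
    return np.array([[(1 - dx) * (1 - dy), dx * (1 - dy)],
                     [(1 - dx) * dy,       dx * dy]])

centered = sample_point(0.0, 0.0)   # all flux lands in a single pixel
worst = sample_point(0.5, 0.5)      # flux split evenly over four pixels
```

Only the centered case reproduces the true one-pixel PSF; the half-pixel case measures a 2x2 PSF at a quarter of the intensity, exactly the over-wide result described above.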

Right now PixInsight doesn't perform non-homogeneous deconvolution or restoration; one kernel is applied to correct the entire image.  If a facility for spatially varying PSFs were developed, my method could be used with some additional work, which would depend on the particular facility provided in PI.

With the current tools in PI the method requires a fair degree of human judgment and tweaking.  Scripting I think would require creation of some new tools, like terrestrial image registration, and greater dynamic range extension for the restoration tool.


cheers
Colin

Offline Simon Hicks

  • PixInsight Old Hand
  • ****
  • Posts: 333
Re: Holiday Gift: My PSF extraction method explained !!
« Reply #3 on: 2010 December 31 11:19:43 »
Hi Colin,

Have you tried Carlos Milovic's ReadPSF process?

http://pixinsight.com/forum/index.php?topic=2675.0

It takes account of the background around the star...well you tell it the background level  ;) . And it works out the central position of the star to subpixel accuracy.

I have tried it and it seems to give good results. You plug the results into the Deconvolution module and it seems to give nice round stars. Maybe worth looking at.

Cheers
         Simon

Offline darkownt

  • PixInsight Enthusiast
  • **
  • Posts: 92
Re: Holiday Gift: My PSF extraction method explained !!
« Reply #4 on: 2011 January 02 10:28:06 »
Hi Simon:

Yes, I've seen Mr. Milovic's tool.  It works great for the kinds of PSFs the restoration and deconvolution tools can generate (a family of Gaussian-like functions having different degrees of kurtosis).  Certainly, if you know your PSF falls within this family of curves, finding the defining parameters to approximate the PSF is all you need to do.

I developed my method specifically for PSFs which do not fall within that family of mathematical curves; it is really directed at extracting actual PSFs, whatever their functional form (whether or not they even have a form which could be defined in functional terms, e.g. a PSF in the shape of the letters "PI").

Try to think of my method as a completely different strategy in your box of tools.  If you know the general functional form, find the parameters using other tools to approximate the PSF; if not, or if the approximate results are not good enough, use this method to extract the actual PSF and avoid any need for parametrization, approximation, or mathematical simulation.

I have had great success with my method even in cases with non-symmetric and irregular PSFs (no 2D line of symmetry, for example a PSF in the shape of the letter "P" in some synthetic tests), as well as PSFs which are more like bokeh, with a central local minimum rather than a central local maximum (doughnut-like PSFs).  Images having PSFs of this kind cannot be deconvolved or restored properly using the standard generated "parametric" PSFs; proper PSF images need to be extracted from the image and provided to the deconvolution and restoration tools.

Let me know if you have any questions and I'll be glad to answer them!

Also I encourage you and anyone to test the limits of the method with some non-standard PSFs.  You'll be surprised how well it works even in the most challenging of situations.

cheers
Colin 

Offline darkownt

  • PixInsight Enthusiast
  • **
  • Posts: 92
Re: Holiday Gift: My PSF extraction method explained !!
« Reply #5 on: 2011 January 05 11:21:12 »
Just curious.  Has anyone tried this yet?

:)

Offline darkownt

  • PixInsight Enthusiast
  • **
  • Posts: 92
Re: Holiday Gift: My PSF extraction method explained !!
« Reply #6 on: 2011 May 18 09:15:42 »
Has anyone tried this yet?   :D