Author Topic: Processing Example: The Moon in NarrowBand using StarAlign & Drizzle  (Read 18152 times)

Offline MikeOates

  • PixInsight Addict
  • ***
  • Posts: 278
First I will explain why and how I took the images in the first place, then go on to the processing. This is a first for me in a few aspects:

  1. First CCD image of the Moon.
  2. First time using StarAlignment to align the Moon, no stars visible.
  3. First time using Narrowband filters on the Moon.

Some time ago I tried to image the Moon with my scope and CCD camera (Takahashi FSQ-106ED and SX-Trius 814 mono), but it was way overexposed, as the CCD camera is too sensitive even with a 1/1000th sec exposure. So I ordered a Neutral Density (ND) filter (0.8).

Unfortunately, I guessed at the wrong density and the images were still overexposed. So I tried imaging through both the ND and narrowband filters; the Ha and NIR subs were overexposed, but the OIII and SII were perfect.

So I made an image using those two filters with the colours mapped as follows:

Red = SII
Green = (SII *0.5) + (OIII *0.5)
Blue = OIII

Note: SII is in the red part of the spectrum and OIII is blue, so they do produce the correct colours. I took 15 subs of each filter, aligned and processed in PixInsight and got the attached image.

The workflow:

Firstly, this is based on a post here by Ignacio (http://pixinsight.com/forum/index.php?topic=5908.15), without which I would not have known where to start, so thank you Ignacio.

I am showing full-size crops of a small part of the image as we go through the tutorial, so you can see the changes at each stage.

As I was only experimenting, I did not take any flats (next time I will), but I did have bias frames, so the first task was to calibrate the subs with the bias frames. This I did with ImageCalibration, but you can use the BatchPreprocessing script if that's what you're used to.
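
Bias-only calibration is just a subtraction, so for anyone who prefers to see the idea rather than the dialog, here is a rough Python/numpy sketch of what ImageCalibration is doing with only a master bias (not PixInsight's actual code; it assumes the frames are already loaded as arrays scaled to the [0,1] range):

Code:
import numpy as np

def calibrate_with_bias(light, master_bias):
    """Subtract the master bias from a light frame and clip to the valid range."""
    return np.clip(light - master_bias, 0.0, 1.0)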

Next, all the subs need to be aligned. Now I am not saying the settings here are the right ones; I certainly have not experimented, as the first values I tried worked for me and are the same as Ignacio used. All values are at default except those marked in yellow. Plus, of course, you need to select a reference image, the images to be aligned and the output directory. In this example I am also using drizzle to get a final image larger than the one taken with the camera. As my camera's field of view is 1.5 degrees across, the Moon (0.5 deg) is rather small in the frame. And I have to say that the increase in detail is amazing; drizzle really does work.

First, StarAlignment. Yes, I know there are no stars, but it does work; you just need to make some changes to the settings as shown.


Each set of subs (one for OIII and one for SII) needs to be integrated (stacked). I used the same settings as I use for deep sky images, as shown in the tutorial by Vicent Peris [http://www.pixinsight.com/tutorials/master-frames/index.html]. Again, for this example 'Generate drizzle data' is checked. The Sigma High and Low are set lower than usual for deep sky images.
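
If you are wondering what the sigma clipping is actually doing, here is a rough Python/numpy sketch of a sigma-clipped average. It is only the idea, not how ImageIntegration implements its rejection, and the default sigma values in the function are just placeholders:

Code:
import numpy as np

def sigma_clipped_mean(stack, sigma_low=3.0, sigma_high=2.5):
    """Average a stack of registered subs (N x H x W), rejecting pixels that
    deviate from the per-pixel median by more than the given sigma limits."""
    stack = np.asarray(stack, dtype=np.float64)
    med = np.median(stack, axis=0)
    std = np.std(stack, axis=0)
    keep = (stack >= med - sigma_low * std) & (stack <= med + sigma_high * std)
    kept = np.where(keep, stack, 0.0)
    counts = np.maximum(keep.sum(axis=0), 1)   # avoid division by zero
    return kept.sum(axis=0) / counts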


As drizzle is being used, DrizzleIntegration is performed next.
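
To give a feel for why drizzle helps, here is a much-simplified Python sketch: each sub is "dropped" onto a grid twice as fine using its measured sub-pixel offset. The real DrizzleIntegration uses the registration data files and distributes flux over a drop footprint; the frames and offsets here are hypothetical inputs, so treat this only as an illustration of the idea:

Code:
import numpy as np

def simple_drizzle(frames, offsets, scale=2):
    """Drop each input pixel onto the nearest cell of a finer grid, using the
    sub-pixel (dy, dx) offset measured for its frame, then normalise by the
    number of drops each output cell received."""
    h, w = frames[0].shape
    out = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(out)
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    for frame, (dy, dx) in zip(frames, offsets):
        oy = np.clip(np.round((yy + dy) * scale).astype(int), 0, h * scale - 1)
        ox = np.clip(np.round((xx + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(out, (oy, ox), frame)
        np.add.at(weight, (oy, ox), 1.0)
    return out / np.maximum(weight, 1.0)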


After integration we end up with two images, the small one being a standard integration and the larger one the drizzled version. The image below shows both right after integration, and for some reason they have a different brightness. Note this is before they have had a ScreenTransferFunction (STF) applied.


The images are now cropped with DynamicCrop to remove some of the black sky, using the same settings for both so they stay aligned.

So here is the starting point, the registered and integrated OIII image. Remember, these are crops of a small part of the image at full resolution.


I then performed Deconvolution on each integrated set. For the non-drizzled image I used a StdDev of 0.5 with the Parametric PSF, no deringing and just one pass. For the drizzled image I used a StdDev of 1.5 and also enabled Deringing, which gave a better result, with Global dark at 0.05 and Global bright at 0.03. A second pass of deconvolution with the same settings, except the StdDev changed to 0.9, enhanced the fine detail.
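
For those who like to see what is going on under the hood, deconvolution with a parametric (Gaussian) PSF is conceptually a Richardson-Lucy style iteration. This Python sketch is only the bare algorithm with an assumed Gaussian PSF; PixInsight's Deconvolution tool adds regularization and the deringing controls mentioned above, so don't treat it as the same thing:

Code:
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(sigma, size=15):
    """Build a normalised Gaussian PSF, roughly what the Parametric PSF mode describes."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def richardson_lucy(image, psf, iterations=30):
    """Plain Richardson-Lucy deconvolution, with no regularization or deringing."""
    estimate = np.full_like(image, 0.5)
    psf_flipped = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode='same')
        ratio = image / np.maximum(blurred, 1e-7)
        estimate = estimate * fftconvolve(ratio, psf_flipped, mode='same')
    return np.clip(estimate, 0.0, 1.0)

# e.g. for the drizzled master: sharper = richardson_lucy(img, gaussian_psf(1.5))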


Here is the deconvolved image: (First Pass)


And the deconvolved image: (Second Pass)


I used LinearFit on the two images (OIII & SII) so that when they are combined to produce an RGB image the colour balance should be correct.
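
LinearFit is essentially a straight-line least-squares fit of one image against the other, which is then applied to the target image. A simplified Python sketch of the idea follows (the real tool also rejects outliers, which I am ignoring here):

Code:
import numpy as np

def linear_fit(target, reference):
    """Fit reference ~= a*target + b and apply that line to the target,
    so its signal level matches the reference image."""
    a, b = np.polyfit(target.ravel(), reference.ravel(), 1)
    return np.clip(a * target + b, 0.0, 1.0)

# e.g. SII_fitted = linear_fit(SII, OIII)   # match the SII master to the OIII reference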


The conversion to RGB was done in PixelMath; a synthetic green channel is made from 50% of each of OIII and SII.
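
For completeness, here is the same combination written as a small Python/numpy sketch rather than as PixelMath expressions (OIII and SII are assumed to be the two fitted master images as 2-D arrays):

Code:
import numpy as np

def combine_rgb(SII, OIII):
    """Map the narrowband masters to RGB with a synthetic green channel."""
    R = SII
    G = 0.5 * SII + 0.5 * OIII
    B = OIII
    return np.dstack([R, G, B])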


We now have a colour image of the Moon, but at this stage you will hardly see any colour, if any at all. To ensure the colour balance is right, a preview of the black sky is made and BackgroundNeutralization is used. No further colour calibration should be needed.
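
The idea behind BackgroundNeutralization is simple: measure the background of each channel inside the sky preview and shift the channels so the background becomes a neutral grey. A very rough Python sketch, with the preview given as hypothetical box coordinates (the real tool offers several working modes, so this is just one interpretation):

Code:
import numpy as np

def neutralize_background(rgb, sky_box):
    """Offset each channel so the median background measured inside
    sky_box = (y0, y1, x0, x1) is the same in R, G and B."""
    y0, y1, x0, x1 = sky_box
    bg = np.median(rgb[y0:y1, x0:x1, :], axis=(0, 1))   # per-channel background
    return np.clip(rgb - (bg - bg.mean()), 0.0, 1.0)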


The image is then made non-linear with HistogramTransformation to get the brightness as you want it. Further multiple CurvesTransformations are done to adjust contrast and colour saturation. I deliberately did not overdo the saturation; it's all too easy to go too far and produce an over-saturated Moon that looks false.
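
For reference, the heart of a HistogramTransformation stretch is the midtones transfer function, which can be written in a few lines of Python. The midtones value in the example call is only a placeholder, not the value I used:

Code:
import numpy as np

def mtf(x, m):
    """Midtones transfer function: m is the midtones balance; values below 0.5
    brighten the image. No shadow or highlight clipping is applied here."""
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# e.g. stretched = mtf(linear_image, 0.05)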


I extracted the luminance and did further work on the L image to sharpen it: UnsharpMask, HDRMultiscaleTransform, more CurvesTransformations for contrast and a final UnsharpMask.
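
The basic unsharp mask operation is easy to write down. This Python sketch is just the core idea: blur a copy and add back the difference. PixInsight's UnsharpMask adds deringing and other controls on top, and the sigma/amount values here are only examples:

Code:
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=0.8):
    """Sharpen by adding back the difference between the image and a blurred copy."""
    blurred = gaussian_filter(image, sigma)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)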


Luminance before further processing:


Luminance after further processing, now sharper and more contrasty:


This luminance was then added back to the RGB image using LRGBCombination.
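
Conceptually, LRGBCombination replaces the lightness of the colour image with the processed luminance. A rough Python sketch using scikit-image's Lab conversion follows; PixInsight works in its own CIE colour spaces with extra chrominance noise reduction and saturation controls, so this is only an approximation of the idea:

Code:
import numpy as np
from skimage.color import rgb2lab, lab2rgb

def lrgb_combine(rgb, lum):
    """Swap the L* channel of the RGB image for the processed luminance
    (lum assumed to be in the 0..1 range; L* runs from 0 to 100)."""
    lab = rgb2lab(rgb)
    lab[..., 0] = lum * 100.0
    return np.clip(lab2rgb(lab), 0.0, 1.0)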


Here is the LRGB image:


We are nearly there: a few more fine adjustments with LocalHistogramEqualization, Curves and finally ACDNR to reduce noise, most of which is generated by all the previous operations. Most of the noise reduction is applied to the chrominance.




Here is the final image at a reduced size:


Visit Astrobin for the full size image.
http://astrob.in/102018/0/

I am learning all the time, and I suspect the LRGB combination could have been done better. All I did was extract the L and add it back after processing, but I think that at the time of extraction the R, G and B channels should also be extracted, and later on the separate L, R, G and B channels recombined.

I also missed out CosmeticCorrection before integration!

Anyway, I hope this will help someone.

Mike
« Last Edit: 2014 June 16 09:57:50 by MikeOates »

Offline Alejandro Tombolini

  • PTeam Member
  • PixInsight Jedi
  • *****
  • Posts: 1267
    • Próxima Sur
Thank you Mike, it is an excellent description of the process and I very much like the Moon image you achieved.

Saludos, Alejandro

Offline Rod771

  • Newcomer
  • Posts: 22
Incredible result.

Well Done Mike, thanks for sharing  :)

Offline MikeOates

  • PixInsight Addict
  • ***
  • Posts: 278
Alejandro & Rod,

Thank you both very much.

Mike

Offline sctall

  • PixInsight Enthusiast
  • **
  • Posts: 88
  • scott
Amazing Mike
Looks like a lot of thinking went into this.

Impressed you can get this result from a mono CCD.

Scott T.
ES102, WO GT81, astronomics, guide scope  CEM60
ASI120MC, ASI224MC, ASI178MM
Lunt60 SS,  moonlight focuser
LX200GPS

Offline CharlesW

  • PixInsight Enthusiast
  • **
  • Posts: 87
This was very helpful and I really like your style in presenting the tutorial. Nice moon, too!

Offline Ignacio

  • PixInsight Old Hand
  • ****
  • Posts: 375
    • PampaSkies
Well done, Mike, a very clear tutorial.

Ignacio

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Nice work Mike. This encourages me to work on a derivative of the StarAlignment tool for nonstellar alignment features. When I designed and implemented the arbitrary distortion correction algorithm I didn't expect such a wide range of applications. Also the use of drizzle is interesting here because it allows you to model a much more accurate PSF. Well done ;)
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline MikeOates

  • PixInsight Addict
  • ***
  • Posts: 278
A big thank you to all those who have commented on this 'tutorial', that gives me the confidence and motivation to do more.

I took some more lunar images a couple of days ago; this time I did the flats and took 30 subs with each filter. The conditions were very different: it was almost daylight, with a bright sky. So when I used the settings in the above example they needed tweaking to get the same result, i.e. only one pass of deconvolution and different sharpening methods for the luminance.

I think this is an important point to put across: any given tutorial or processing example should only be used as a starting point from which changes can be made if you don't get the result you're after. Each set of subs, from different people using different equipment under different observing conditions, means no one set of settings or sequence of tools will suit all.

Quote
Nice work Mike. This encourages me to work on a derivative of the StarAlignment tool for nonstellar alignment features. When I designed and implemented the arbitrary distortion correction algorithm I didn't expect such a wide range of applications. Also the use of drizzle is interesting here because it allows you to model a much more accurate PSF. Well done ;)

Juan, I am very glad you liked the result and that it will encourage you to improve the StarAlignment tool for nonstellar alignment; that's very good news :) Does this mean you may look at the local distortions that occur due to seeing changes? If you look at a set of subs with the Blink tool, you can see various parts of the lunar surface moving about, or is that already included in the arbitrary distortion correction algorithm?

Thanks,

Mike

Offline MikeOates

  • PixInsight Addict
  • ***
  • Posts: 278
I thought I would follow up with a cropped example of one image with drizzle and the other without.

I don't think you need me to tell you which is which  ;)

In order to show them side by side, I used Resample at 200% on the non-drizzled image. Needless to say, I shall be using drizzle as part of my normal processing workflow.

Mike

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Quote
If you look at a set of subs with the Blink tool, you can see various parts of the lunar surface moving about, or is that already included in the arbitrary distortion correction algorithm?

The algorithm does not know what kind of distortions are being modelled and corrected. So yes, these local distortions should already be corrected by the current implementation. The only problem is that SA is looking for stars as alignment references, which is suboptimal for lunar and planetary images.
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline Bart_van_der_Wolf

  • Newcomer
  • Posts: 10
Quote
This encourages me to work on a derivative of the StarAlignment tool for nonstellar alignment features.

Yes please!

Cheers,
Bart

Offline astroedo

  • PixInsight Addict
  • ***
  • Posts: 171
  • Io ne ho viste... cose che voi umani...
    • L'arciere celeste
Really a great processing example!
I was about to try exactly the same thing with my SBIG ST2000 XM but using Halpha, OIII and Hbeta filters for RGB.
I will use your tutorial as a guideline.

Offline Torsinadoc

  • PixInsight Enthusiast
  • **
  • Posts: 98
I modified it for DSLR.  I like the workflow. Thanks

http://astrob.in/full/109107/0/

Offline MikeOates

  • PixInsight Addict
  • ***
  • Posts: 278
astroedo & Torsinadoc,

Thank you, I am glad you like the process.

Mike