NormalizeScaleGradient


jmurphy

This script is designed to prepare calibrated frames for ImageIntegration. It adjusts the (brightness) scale and gradient of all frames to match the best frame. This has many benefits; for example:
  • More precise data rejection: it becomes easier to reject satellite trails, hot pixels and cosmic ray strikes.
  • The final stacked image will only contain the gradient from the best (reference) frame.
The script is an alternative to the LocalNormalization (LN) process. LN can produce fantastic results, but it should not be used blindly. It is trying to solve an 'ill-posed' problem: there is insufficient data to determine a single solution. The situation is even more difficult if the star profiles vary between images. It is therefore essential to use it with care, carefully assessing whether the settings are optimal. It should only be used when it is really necessary, because if used without adjusting the defaults it can produce undesirable results.

The NormalizeScaleGradient script tries to solve the 'ill-posed' issue by using extra data: photometry. The only cost is extra processing time. I have also tried to make it easy to use by calculating 'Auto' default values from the focal length, pixel size and image scale. It will therefore usually produce a good result with its default values.
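To make the photometry idea concrete, here is a minimal sketch (in Python, purely illustrative; the actual script is a PixInsight JavaScript script) of how a single brightness scale factor could be estimated from the fluxes of stars matched between a registered target frame and the reference. The function name and the sigma-clipping details are my own assumptions, not the script's code.

```python
import numpy as np

def estimate_scale(ref_fluxes, tgt_fluxes, n_iter=3, sigma=2.5):
    """Estimate the factor that maps target star fluxes onto the reference,
    using the flux ratio of matched stars with simple sigma clipping.

    ref_fluxes, tgt_fluxes: background-subtracted fluxes of the same stars
    measured in the reference and target frames (illustrative inputs).
    """
    ref = np.asarray(ref_fluxes, dtype=float)
    tgt = np.asarray(tgt_fluxes, dtype=float)
    keep = (ref > 0) & (tgt > 0)
    ratio = np.where(keep, ref / np.where(tgt > 0, tgt, 1.0), np.nan)
    for _ in range(n_iter):
        med = np.nanmedian(ratio[keep])
        std = np.nanstd(ratio[keep])
        keep &= np.abs(ratio - med) < sigma * std
    # Least-squares slope through the origin, using the surviving star pairs
    return np.sum(ref[keep] * tgt[keep]) / np.sum(tgt[keep] ** 2)
```

Multiplying the target frame by the returned factor would bring its stars onto the same brightness scale as the reference, which is the 'scale' half of what the script normalizes.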

However, this new script is an alternative to LN, not a replacement. NormalizeScaleGradient assumes that a single scale factor is valid for the whole image. If you have not applied a flat frame, this requirement will not be met, and you should use LN instead. NormalizeScaleGradient is also CPU-intensive, so it is not ideal if you are in a rush. After processing the first few images, it estimates the remaining time. If you have a large number of images from a big sensor, you might have enough time to take the dog for a walk! The good news is that it needs very little memory, no matter how many images are thrown at it.


How to use
The input frames must be registered to each other. First load all the files using the 'Add' button. Note the altitude displayed on the left (blank if the FITS header does not contain it). Set the reference frame to the frame with the smallest gradient. Select the worst frame (probably the one with the lowest altitude) and display the 'Gradient graph'.

The 'Gradient graph' displays the horizontal component of the gradient. Since light pollution is usually a smooth variation, this curve should also be smooth.

Select 'OK' and go and get a well-deserved drink!

To stack the images, you need to set a few parameters in ImageIntegration, because your images have already been normalized and we don't want this work to be undone. I think the following settings are correct, but please let me know if they are not :)
[Screenshot: ImageIntegration settings]

'Normalization' should be set to 'No normalization'. We have already done that job!
However, the 'Weights' method needs to be selected. 'Noise evaluation' is a good option. An alternative is to use the Weight keyword 'WEIGHT' (added to the header by NormalizeScaleGradient).
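For anyone driving their stacking from a script rather than the GUI, a header keyword such as the WEIGHT value mentioned above can be read back like this (a small illustrative snippet using astropy; only the keyword name comes from the post, everything else is generic):

```python
from astropy.io import fits

def read_weight(path, keyword="WEIGHT"):
    """Return the per-frame weight stored in the FITS header,
    or None if the keyword is absent."""
    with fits.open(path) as hdul:
        return hdul[0].header.get(keyword)

# Example (hypothetical file names):
# weights = {p: read_weight(p) for p in ("frame1_nsg.fits", "frame2_nsg.fits")}
```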

[Screenshot: ImageIntegration 'Pixel Rejection (1)' settings]

We also need to set 'Pixel Rejection (1)' > 'Normalization' to 'No normalization'.


Install
Unzip the script to a folder of your choice.
In the PixInsight SCRIPTS menu, select 'Feature Scripts...'
Select 'Add' and navigate to the folder.
Select 'Done'
The script will now appear in 'SCRIPTS > Batch Processing > NormalizeScaleGradient'

See message #10 for more details on this script.
See message #47 for the script.

Regards, John Murphy
 
Hi John,

Is your script also applicable to calibrated, debayered and registered light frames of an OSC camera?

Bernd
 
I am currently looking into a problem. I will upload a new version once I have fixed it.

John
 
I have been very busy over the last 10 days, modifying and testing NormalizeScaleGradient. It is now ready for release (see the attached zip file) :)

I will start by showing a stacked image of NGC6946 (Fireworks galaxy). The data was taken over two nights, with slightly different framing.

(1) Standard stack.
Image Integration Normalization: Additive with scaling.
Pixel Rejection Normalization: Scale + zero offset.
[Attached image: standard stack of NGC 6946]

The inbuilt normalization does not tackle gradients. The visible consequences are:
(a) All stacked images have contributed to the gradient
(b) There is a dark vertical strip on the right-hand side. Not all frames covered this area, so we would expect this strip to contain more noise. However, since the stack calculates the average, not the sum, this strip should not be darker. The reason it is darker in this example is not an error in the code; the standard normalization is doing exactly what it was asked to do. The vertical strip is darker because of the gradients in the individual images.

(2) Inbuilt ImageIntegration Adaptive normalization
[Attached image: Adaptive normalization stack of NGC 6946]

This option is designed to handle gradients, and we can see that it has done a much better job. It is very easy to use, and it is fast. :)
Why not always use this option? Well, gradient corrections are heavily dependent on the detected (brightness) scale, and even small errors in the scale can produce much larger errors in the gradient removal; a scale error of only a few percent, applied across a bright sky background, can leave behind a residual comparable to the gradient being corrected. This is why gradient removal using Local Normalization or Adaptive Normalization is not the default option; it is used when the gradients are large enough to cause problems that outweigh the risks.

(3) NormalizeScaleGradient
[Attached image: NormalizeScaleGradient stack of NGC 6946]

In this case NormalizeScaleGradient has performed slightly better than Adaptive normalization. The difference is most noticeable around the bright star on the right hand side. However, it did take a great deal more CPU time! So is it worth it?

I believe that the biggest advantage is the method used to determine the image scale: photometry. Provided certain conditions are met (more about that later on), it can produce very accurate results. The gradient model is made more accurate by ensuring that its sample points avoid bright stars and the scattered light around them. PixInsight's SurfaceSpline is then used to create the gradient model that is subtracted.
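As a rough illustration of the idea (not the script's actual SurfaceSpline code), the gradient model amounts to fitting a smooth 2-D surface through the background differences measured at star-free sample points, and then subtracting that surface from the target frame. A minimal Python sketch, where the function name, the thin-plate-spline kernel and the coarse-grid evaluation are all my own assumptions:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator, RegularGridInterpolator

def model_gradient(sample_xy, sample_diff, shape, smoothing=1.0, step=32):
    """Fit a smooth 2-D surface to (target - reference) background samples
    and return it as a full-size array that could be subtracted from the
    target frame.

    sample_xy   : (n, 2) sample positions (x, y), chosen away from stars
    sample_diff : (n,) background differences measured at those positions
    shape       : (height, width) of the image
    smoothing   : larger values give a stiffer, smoother surface
    """
    rbf = RBFInterpolator(sample_xy, sample_diff,
                          kernel="thin_plate_spline", smoothing=smoothing)
    h, w = shape
    ys = np.linspace(0, h - 1, max(2, h // step))
    xs = np.linspace(0, w - 1, max(2, w // step))
    gx, gy = np.meshgrid(xs, ys)
    coarse = rbf(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gy.shape)
    # Bilinear interpolation of the coarse surface up to full resolution
    interp = RegularGridInterpolator((ys, xs), coarse)
    yy, xx = np.mgrid[0:h, 0:w]
    return interp(np.column_stack([yy.ravel(), xx.ravel()])).reshape(h, w)
```

In this sketch the sample differences would be measured after the scale factor has been applied, so subtracting the surface leaves the target with approximately the reference frame's background; the smoothing parameter plays a role loosely analogous to the script's smoothness setting.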

Due to this increased accuracy (at the cost of processing time), I believe it is safe to use this process even when the gradients are quite small. Doing so also has an extra advantage: it saves a WEIGHT entry in the FITS header that can be used by ImageIntegration. This weight is based on the amount each image needed to be scaled, so it is directly proportional to the amount of light that was detected from the object. If the background level of the images varies (for example due to light cloud), this is likely to be a better indicator than noise evaluation. See post https://pixinsight.com/forum/index.php?threads/question-on-snr.15966/post-96441
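Purely as an illustration of the idea (the script's exact weight formula is not given in this thread), a scale-based weight could look something like this: a frame that must be multiplied up by a larger factor to match the reference detected less light, so it receives a smaller weight.

```python
def scale_based_weight(scale_to_reference):
    """Hypothetical weight, not the script's actual formula: if the target
    must be multiplied by 'scale_to_reference' to match the reference, a
    larger factor means less detected light and therefore a smaller weight."""
    return 1.0 / scale_to_reference
```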

The script is designed to be very 'transparent'.
  • Not confident it got the scale right? Look at the photometry graph.
  • How smooth or lumpy is the gradient correction? See the gradient graph. Adjust smoothing if necessary.
It can also save the applied gradients and gradient 'curvature' files. The curvature files show the gradient after the average gradient has been removed: a flat tilted plane has been subtracted from the surface spline model of the gradient. This shows how smooth or lumpy the gradient correction is. Note that an STF will apply a massive stretch to these curvature files; the very small differences need a 24-bit LUT to display well, and Blink does not use a 24-bit LUT.
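For anyone curious what the curvature files contain, here is a small sketch of the operation described above (my own illustration in Python, not the script's code): fit and remove the best-fit tilted plane from the gradient model, so only the deviation from a flat tilt remains.

```python
import numpy as np

def gradient_curvature(gradient):
    """Subtract the least-squares plane a*x + b*y + c from a 2-D gradient
    model; what is left shows how 'lumpy' the correction is."""
    h, w = gradient.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, gradient.ravel(), rcond=None)
    plane = (A @ coeffs).reshape(h, w)
    return gradient - plane
```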

I have set the default smoothness to a very conservative 2.0, which should remove the vast majority of the gradient with no risk of introducing artifacts. With my own data, I often find that values between 0.0 and 2.0 produce great results.

So are there any situations where this script should not be used?
The photometry requires that the data is linear and that a single scale factor is valid for the whole image. Provided that the images have been flat corrected, this requirement is usually met. However, there are exceptions. For example, if a cloud dimmed the whole of the frame evenly, this is OK; but if it affected the frame unequally, the calculated scale might be wrong. In these situations Local Normalization is probably a better solution.

One final but very important point:
The normalization does not remove all gradients. It normalizes all the images to have the same gradient as the reference frame. It is therefore important to select a good reference frame. Use SubframeSelector to find images with a low median, and blink through your images.
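If you want a quick numerical starting point before blinking, something like the following (illustrative only; SubframeSelector gives far more information) ranks the registered frames by their median background level:

```python
import numpy as np
from astropy.io import fits

def rank_by_median(paths):
    """Return (median, path) pairs sorted from lowest to highest median,
    a rough first cut when looking for a low-gradient reference frame."""
    results = []
    for p in paths:
        with fits.open(p) as hdul:
            results.append((float(np.median(hdul[0].data)), p))
    return sorted(results)
```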

Do not be tempted to apply a DBE to the reference frame. This should only be done after stacking.

Regards, John Murphy

Script is now attached to message #47
 
a couple of questions:

since every single one of my subs has some kind of gradient, there is probably no "best" gradient. will the tool work OK if there is a serious gradient in the reference image? although i wonder if there's any point in creating gradients in the normalized subs, i suppose it would make DBE of the integrated image easier if all input subs had the same gradient. i often end up with some very wild LP patterns in an integration and i assume NSG would eliminate that.

and just out of curiosity how is the gradient modeled? you're giving a smoothness control, so is it like DBE? or a quadratic function like ABE? [edit - sorry, i missed your description of SurfaceSpline]

rob
 
Hi Rob
Some good questions!

(1) If all the gradients are similar, it really does not matter which you choose. In situations like this, I would simply choose the image with the highest altitude. I display the altitude in the left-most column of the target image list.

(2) The aim of the script is to make all the images match. This has the following advantages:
  • ImageIntegration: outlier rejection will work more efficiently if the images match. This will improve the removal of hot pixels, cosmic ray strikes, and satellite trails.
  • The stacked result will still contain the gradient of the reference frame, so it won't look pretty! The gradient will still need to be removed, for example by DBE. It is hoped that the gradient in the stacked image will be smaller and less complex.
  • It should allow you to stack images from multiple nights, from either side of the meridian, in one go.
Regards, John Murphy
 
ok, i will test this out on some of my more difficult data and see what happens!
Great! I expect the differences to be quite subtle, but I feel it is worth trying hard to get the most out of our data. It took many hours to collect, after all!
 
Thanks for reporting that. Windows is not case sensitive, but Linux is...
Version 0.4 is attached to message #20
 
i've tried this on some older m101 and m33 images from my 500mm setup. i think the results are an improvement vs. regular normalization. the final gradient is a little easier to remove, though i still had to make a pretty fine DBE grid in both cases. one really good thing is that the ragged edges of the image (caused by dithering) are cleaned up a lot by this flow. it will probably mean less cropping is necessary in the final image.

my images might not be a good test because they do suffer from flattening problems. the m101 image is particularly bad, the m33 not as bad.

i have some images from an 1100mm setup that were quite challenging - i will try those next.
 
John

This is gold dust for those under variable weather conditions (anywhere in the UK!) and light pollution (anywhere in the UK!). Thank you!
Will test it as soon as I can.

Roberto
 
I have made a few minor improvements to the user interface:
  • Added a 'Set reference' button that sets the reference to the selected target image
  • The target image table can now be sorted on either 'Altitude' or 'Filename'. Click on the table header to sort.
  • Minor user interface changes (tooltips, label text)
See message #47 for version 0.11 of the script.
 