NormalizeScaleGradient: Bookmark website now!

Status
Not open for further replies.
Hi John, I'm not certain if it's been reported before, but it looks like there's a small bug in the altitude sort: it appears to sort on the text value rather than the numerical value. I was shooting Leonard, which was pretty low down, with the altitude dropping from double digits to single digits.
 

Attachments

  • Screen Shot 2022-01-03 at 11.01.47 am.png
    36.3 KB · Views: 57
You are the first to report it! That is impressively close to the horizon!
The table column sort assumes text, and unfortunately I don't think I can change it from the JavaScript.
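To see why a text sort misorders altitudes, here is a tiny JavaScript illustration (my own example values, not from the script): a text comparison works character by character, so "9.5" sorts after "10.2".

    // Lexicographic sort misorders numbers stored as text.
    var altitudes = ["10.2", "9.5", "45.1", "5.8"];
    altitudes.sort();
    console.writeln(altitudes); // 10.2,45.1,5.8,9.5
    // A numeric comparator gives the intended order.
    altitudes.sort(function (a, b) { return parseFloat(a) - parseFloat(b); });
    console.writeln(altitudes); // 5.8,9.5,10.2,45.1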
 
Fixes a possible crash when detecting stars

[This is now available as a PixInsight update]
 
Hi John,
I am using NSG 1.4.5 to further process 46 narrowband subs of the Rosette Nebula, taken with a dual-band (Ha/Oiii) filter and a cooled color CMOS camera. I had WBPP 2.3.2 separate the output subs into RGB; the R channel is the strongest. On some R-channel subs, Detecting Stars (at the default -1 setting) stalls at some percentage, e.g. 48%, 60%, etc. The process console first displays 'Detected 2228 stars', then repeats and gets stuck at that percentage. If I increase the Detected Stars setting, it first detects fewer stars (1656) and then displays them. Other subs work fine with the default Detected Stars = -1.

I don't think this is surprising to you, but I would like to know why, i.e. what difference in my subs causes it to get stuck on some of them.

BTW, I will be using NWeight for my weighting this time, as I have full confidence in it, and I am not yet convinced that PSF signal weight or PSF power weight is the way to go. Some time is needed for others to compare and contrast the differences.
Thanks,
Roger
 
Does it still happen with the latest version I posted (my previous message)? If so, could you send me the reference and a target image that demonstrates the problem, and I will look into it.
Regards, John
 
Hi John,
The updated script does not have the problem. Thanks for your continuous improvements!
Roger
 
John,

If I understand the truncation issue correctly, the safer way to correct it is division, so that there is no danger of clipping the target image at the low end. Do you think it might be useful to add a parameter to the target image section, called 'clipping protection' or something similar, and have NSG automatically apply it to the target image as a divisor? This would save users a pre-processing step in PixelMath, and it could offer a default that is likely to work in most circumstances, so that users don't have to guess at an appropriate value.

Practically speaking, to date I have not bothered to try to avoid truncation, mainly because I don't realize it is necessary until after NSG has run, and I don't want to bear the computational expense of running it again. But it seems my star cores might be slightly better if I used it routinely. Automating this would certainly make that easy.

Happy New Year!

John
My initial answer to this was wrong. So is my advice in the help file. You should NOT rescale only the reference frame. This will result in an inaccurate weight being calculated for that frame. Instead, you would have to rescale all the images.

I am currently working on a new version that avoids truncation by rescaling the images after they have all been normalized.
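As a sketch of that idea (my own PJSR-style pseudocode, not the actual NSG source; Image.maximum() and Image.apply() are standard PJSR calls, but treat the details as an assumption):

    // Divide every normalized frame by the global maximum so that no
    // pixel exceeds 1.0. Because all frames share the same divisor,
    // their relative scales, and hence the calculated weights, are preserved.
    function rescaleAll(images) { // 'images': hypothetical array of PJSR Image objects
        var globalMax = 0;
        for (var i = 0; i < images.length; ++i)
            globalMax = Math.max(globalMax, images[i].maximum());
        if (globalMax > 1)
            for (var j = 0; j < images.length; ++j)
                images[j].apply(1 / globalMax, ImageOp_Mul);
    }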
 
NormalizeScaleGradient 1.4.6 Beta

In this release:
  1. Images are rescaled to avoid truncating high values.
  2. The reference image text field is cleared if it is no longer in the target images list.
  3. Bug fix: If the input file was in Integer format, the corrected file would lose all its original FITS headers. This has now been fixed.
Install
Unzip the script to a folder of your choice.
In the PixInsight SCRIPTS menu, select 'Feature Scripts...'
Select 'Add' and navigate to the folder.
Select 'Done'
The script will now appear in 'SCRIPTS > Batch Processing > NormalizeScaleGradient'

[Edit: Final version 1.5 is attached to message #351]
 
Just a question, not directly related to NSG:
Would it be possible to make a 'PhotometricABE' that would remove the background based on photometry?
John seems to be so comfortable with these photometry things!
 
Stellar photometry is really good at measuring the brightness scale. NSG can use this scale factor to determine the relative gradient between images. Hence it can accurately normalize all the target images to a reference frame. However, the photometry provides insufficient information to accurately determine what the actual sky background really is.

So I think the answer is that DBE and ABE are still the best solutions for background removal.

Of course, provided the NSG reference frame has less gradient than the other images (perhaps it was taken on a night with no moonlight, for example), the remaining gradient in the final stacked image should be greatly reduced, and should therefore be easier to remove in DBE or ABE.
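A rough way to see this limitation (my summary, not John's wording): photometry yields the brightness scale factor $k$ between a target and the reference, so the smooth residual after scaling estimates only the difference of the two sky gradients,
$$k \cdot \mathrm{target}(x,y) - \mathrm{reference}(x,y) \approx g_t(x,y) - g_r(x,y),$$
and the absolute background of either image, $g_t$ or $g_r$ on its own, never appears.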
 
NormalizeScaleGradient 1.5 (final version)

In this release:
  1. Images are rescaled to avoid truncating high values.
  2. 'Rescale result' checkbox has been added.
  3. The reference image text field is cleared if it is no longer in the target images list.
  4. Bug fix: If the input file was in Integer format, the corrected file would lose all its original FITS headers. This has now been fixed.
  5. Bug fix: If the image was severely black clipped (entirely zero / black background), the script could fail. This has now been fixed.
Install
Unzip the script to a folder of your choice.
In the PixInsight SCRIPTS menu, select 'Feature Scripts...'
Select 'Add' and navigate to the folder.
Select 'Done'
The script will now appear in 'SCRIPTS > Batch Processing > NormalizeScaleGradient'
 
John, now I have two versions 1.4.5 and 1.5. How do I clean up the library?
Larry
If you have manually installed multiple versions, delete the versions you don't want to keep. Then in:

SCRIPTS -> Feature Scripts

You can deselect the scripts you don't want, then select 'Done'.
If all else fails, you can use 'Regenerate'.
 
1.4.5 was loaded automatically during the last PI update. I manually installed 1.5, via Feature Scripts, from my downloads folder. So they are in two different places. In the future, should I select the PI src/scripts folder when doing a manual update?
Larry
 
Normally you should manually install a script to a non-PixInsight folder. That way, you don't need root permissions, and when you reinstall a new version of PixInsight it does not overwrite the script.

However, if the script is included with PixInsight, you could use the PI src/scripts folder, because you probably want the script to be replaced when you install the next version of PixInsight. If you do this, make sure you replace all of the NSG files, including everything in the NormalizeScaleGradient/lib folder.
 
The 'Rescale result' option does prevent truncation within NSG, but the stacked image from ImageIntegration may still truncate some star cores. Part of the reason is that if only a few images have the full dynamic range, ImageIntegration's data rejection will reject the star's peak... Another reason is that ImageIntegration's combination defaults to 'Average', so the images with flat (clipped) star peaks influence the result.

However, if the star cores are important, it is easy to fix this:
Find an image with high dynamic range and a good star profile. The image with the highest dynamic range may be the one with the shortest exposure time, or the lowest weight. One way to locate it is to use the Blink process statistics: select the NSG FITS header 'NSGHIGH'. The higher its value, the greater the image's dynamic range. In the following example, an image with a very small weight of 9% (w009_image) was used, which had an NSGHIGH of 2.26.
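As an alternative to Blink, a short PJSR snippet can list NSGHIGH for the open images (a sketch; it assumes the NSG output files are already open as image windows):

    // Print the NSGHIGH FITS keyword for every open image window.
    var windows = ImageWindow.windows;
    for (var i = 0; i < windows.length; ++i) {
        var keywords = windows[i].keywords;
        for (var k = 0; k < keywords.length; ++k)
            if (keywords[k].name == "NSGHIGH")
                console.writeln(windows[i].mainView.id + ": NSGHIGH = " + keywords[k].value);
    }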

We wish to only use the star peaks from this image. To do this, we black clip the image to remove everything except the brighter stars. If any hot pixels are still visible, remove them with CloneStamp. Example PixelMath expression:
[screenshot of the PixelMath expression]
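The screenshot is not reproduced here, but a black clip of this general form would do (the 0.25 threshold is my illustrative guess; choose a level that keeps only the bright star cores):

    iif($T > 0.25, $T, 0)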

Now we need to fix the integrated image with these stars:
[screenshot of the PixelMath expression]
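Again the original expression is a screenshot. Judging from the 'RescaleMax' profile below, a max-combine of this general form is the likely shape of it ('stars' stands for the hypothetical identifier of the clipped star image, with the integrated image as the PixelMath target):

    max($T, stars)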


Some example star profiles:
[ResampleIntegration.png: star profile from the original stacked image]
[RescaleMax.png: the fixed star profile]
 
NormalizeScaleGradient 1.6

New in this version:
  • Improved error handling. In the Output Images section, I have added an 'On error:' combo box with the options Continue, Abort, and Ask user. This fixes an issue where, if all the images failed, it was previously necessary to click an 'OK' dialog once for every file.
  • Full paths toggle button. This allows the target image filename to be displayed with or without its full path. The Filename column's tool tip always displays the full path.
  • Extra columns in the Target Images table. All columns are sortable by clicking on the column header:
    • PSF Weight: The PSF Signal Weight, normalized to the 0.0 to 1.0 range. Higher is better.
    • Noise: A NOISExx noise estimate, scaled by the exposure time and airmass (NSG has not calculated the brightness scale yet). It is normalized to the 0.0 to 1.0 range. Lower is better.
    • Airmass: Displayed if available in the FITS header.
    • Time: Exposure time in seconds. This is for display only; NSG scales images correctly without needing to know the exposure time.
    • Filter: The filter must match the reference filter.
  • Table double click: Double-click on a table row to display the image.
  • If the altitude is less than 10, the number is prefixed with '0' to ensure the values sort correctly (see the sketch after this list).
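A minimal sketch of that padding trick (illustrative only, not the actual NSG source):

    // Pad single-digit altitudes so a lexicographic column sort
    // orders them numerically: "05.8" < "10.2", whereas "5.8" > "10.2".
    function formatAltitude(altitude) {
        var text = altitude.toFixed(1);
        return (altitude < 10) ? "0" + text : text;
    }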
Install
Unzip the script to a folder of your choice.
In the PixInsight SCRIPTS menu, select 'Feature Scripts...'
Select 'Add' and navigate to the folder.
Select 'Done'
The script will now appear in 'SCRIPTS > Batch Processing > NormalizeScaleGradient 1.6'

Thanks, John Murphy
 

Attachments

  • NormalizeScaleGradient.zip
    115 KB · Views: 107
Hi John, I am currently processing a very large OSC dataset with different exposure times and binnings. The problem is that the weighting of the 2x2-binned images comes out much too high. As a reference I use an image taken at bin 1 with a 300s exposure time, which gives a weighting of w100. The bin 2 images have only 10s exposure time at high gain, yet their weighting is over w200, which has a big impact on the integration. There are also 10s, high-gain images in the dataset taken at bin 1; their weighting is more like w007 on average. The bin 2 images should have a similar value (similar recording conditions). Could the aliasing effect of the pixel interpolation during registration cause this? I am using version 1.6 of the NSG script.
 
I believe that this is a problem for all weight algorithms that depend on the PixInsight NOISExx headers. I explained one of the reasons that an image resize caused problems here:

One option would be to use PixelMath and an ImageContainer to multiply all your binned images by a correction factor. I think 0.2 might get you somewhere close. Then run NSG.
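For example (assuming 0.2 is an appropriate factor for your data), the expression applied through the ImageContainer could be as simple as:

    $T * 0.2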

Regards, John
 
Hi John, thanks for pointing that out. I have tried it with the correction factor, and now the weighting looks better. How did you arrive at the value of 0.2?
 