NormalizeScaleGradient

Hi Roger. You have produced some interesting data.
Were you using NormalizeScaleGradient 1.1? The new version calculates the weights more accurately. Thanks to Adam Block for spotting the error in version 1.0.

Image weights should depend on the exposure time, the amount of light pollution, and how much light was absorbed by the atmosphere / cloud.

If the dominant problem is light pollution, I would expect a strong correlation with the image median level. On the other hand, if the dominant problem is absorption, the correlation with the median would be low. There should be a strong linear correlation between exposure time and NWEIGHT (provided the conditions stay the same).
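If you want to check that correlation on your own data, a quick Pearson correlation between EXPTIME and NWEIGHT will do. A minimal sketch in plain JavaScript (the arrays are hypothetical; fill them from your FITS headers):

```javascript
// Pearson correlation coefficient between two equal-length arrays.
function pearson(x, y) {
    let n = x.length;
    let mx = x.reduce((a, b) => a + b, 0) / n;
    let my = y.reduce((a, b) => a + b, 0) / n;
    let sxy = 0, sxx = 0, syy = 0;
    for (let i = 0; i < n; ++i) {
        let dx = x[i] - mx, dy = y[i] - my;
        sxy += dx * dy;
        sxx += dx * dx;
        syy += dy * dy;
    }
    return sxy / Math.sqrt(sxx * syy);
}

let expTime = [300, 300, 600, 600, 900];      // seconds (hypothetical)
let nweight = [0.48, 0.51, 0.97, 1.02, 1.49]; // NWEIGHT (hypothetical)
// A value near 1 indicates the strong linear correlation described above.
console.log(pearson(expTime, nweight));
```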
Regards, John

Hi John,
Well, you and Adam got me motivated to use this script, and I used it early, before it was a PI release.
My NSG script run of 100 images used Rev 0.7 or 0.8. It took over an hour to run. Capturing data (from PI) and putting it into a spreadsheet was a several-evening effort. I don't want to do it again just for an incremental improvement in the analysis. I think the results are good enough to demonstrate the correlations I posted.
I did get a strong correlation with Median and with Stars Detected, and my main issue was light pollution.
Thanks,
Roger
 
NSG is not currently compatible with drizzle. To make it compatible, NSG needs to output .xnml files. This is not easily done from JavaScript. Hence, this functionality will have to wait until I have ported NSG to C++. That needs doing anyway for performance reasons.

Is there a way of getting the current script to work with drizzle? The short answer is no, there is no good way of doing this. I have not tried it, but there might be a very hacky way of doing it. This hack (if indeed it works) would require mono files, so you would need to extract the R, G, B channels from your color camera's data.
  • If color, extract R, G, B from CFA
  • If there is a meridian flip, use 'FastRotation' to rotate the images by 180 degrees either before or after the flip.
  • Run NSG on the unregistered files. To make this work, increase the 'Photometry Star Search' -> 'Star search radius' to its maximum (20). Provided the shift between images is less than this search window, it should still manage to match the ref / tgt stars. Reduce the 'Star flux tolerance' to reduce the risk of invalid star matches; try 1.2.
  • Use the 'Photometry stars' dialog to check that the ref / tgt stars have been correctly matched. Test on the image with the greatest shift from the chosen reference image.
  • Increase the sample size to 1.5 times the 'Auto' default size.
  • Only use smooth gradient corrections. The default of 2 should work well. Don't use less than 1.0.
  • Deselect 'ImageIntegration'.
Then do your drizzle processing as normal, starting by registering the _nsg files with 'Generate drizzle data' selected. Make sure you set up the ImageIntegration process with 'No normalization' and the NWEIGHT keyword (a scripted sketch of these settings follows). Note that DrizzleIntegration will then go back to the unregistered, but scale- and gradient-corrected, _nsg files.
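For reference, these ImageIntegration settings can also be set from a script. A minimal sketch, assuming the usual PJSR parameter names (verify against the source generated by 'Edit Instance Source Code' on your own ImageIntegration instance):

```javascript
// Sketch: ImageIntegration settings for NSG output (parameter names assumed
// from typical PJSR instance source; check them on your PixInsight version).
var P = new ImageIntegration;
P.normalization = ImageIntegration.prototype.NoNormalization; // NSG already normalized
P.weightMode = ImageIntegration.prototype.KeywordWeight;      // weight from a FITS keyword
P.weightKeyword = "NWEIGHT";                                  // keyword written by NSG
P.generateDrizzleData = true;                                 // update the .xdrz files
// ... add the registered _nsg files to P.images, then P.executeGlobal();
```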

As you can see, it is a very hacky solution... It is probably better to wait for the C++ version that will do the job properly.

If anyone tries this, do let me know if it works!

John

Why would you need .xnml files for drizzle integration? Those are Local Normalisation files.
Unless I am mistaken, only .xdrz files are necessary for DrizzleIntegration, and those arise as a result of image registration. Since the NSG script works on the pixel values of the .xisf files, and integrating those lets the .xdrz files be updated with new weights and pixel rejection, why would the new .xdrz files not work straight away in DrizzleIntegration?

Roberto
 
NSG needs registered files, right?

i think generating the .xnml files would make NSG fit into the same place in the flow as LocalNormalization does now. currently the .xdrz files point back to the un-registered and thus un-normalized files, so you can't drizzle the normalized files.

this is why in the quoted post john is suggesting trying the hack of running NSG on the unregistered files. in that way SA and II run on the NSG files because they have been normalized before StarAlignment, and the .xdrz files will reference the output files of NSG. but there's no real guarantee this will work because NSG has to search around in the dithered files to make star matches.

rob
 
Yes, that is correct. The .xnml files contain the normalization information (the scale and gradient) so that it can be applied by a process later in the chain.

The hack might work - it is quite good at matching stars - provided the maximum shift does not exceed the search window, but you would need to check (the 'Photometry stars' dialog). The star matching is quite robust because it takes into account the expected star flux. The major problem would be with the gradient calculation - the samples would correspond to slightly different squares in the reference and target images due to the image shifts. This is why you would need to be conservative about the sample size (not too small) and the amount of gradient smoothing applied (more smoothing is better). You would also have to set up the ImageIntegration settings yourself. As you can see, running it like this would add extra work. Perhaps too much.
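As an illustration of why flux-aware matching is robust, here is a hypothetical sketch of the kind of test involved: a candidate is accepted only if it lies within the search window and its flux ratio is consistent with the expected scale between the two images. The names and the way the expected ratio is obtained are assumptions, not NSG's actual code:

```javascript
// Hypothetical flux-aware star matching test (not NSG's actual code).
// refStar, tgtStar: {x, y, flux}. expectedRatio: rough tgt/ref flux scale.
function isMatch(refStar, tgtStar, searchRadius, fluxTolerance, expectedRatio) {
    let dx = tgtStar.x - refStar.x;
    let dy = tgtStar.y - refStar.y;
    if (dx * dx + dy * dy > searchRadius * searchRadius)
        return false; // outside the star search window
    // Accept only if the measured flux ratio sits within the tolerance
    // band around the expected ratio (e.g. fluxTolerance = 1.2).
    let ratio = tgtStar.flux / refStar.flux;
    return ratio < expectedRatio * fluxTolerance &&
           ratio > expectedRatio / fluxTolerance;
}
```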

I could spend time making the hack work better. For example, I could shift the sample grid as the images shift. However, I reckon it is better that I spend this time relearning C++ (I last used it over 22 years ago) and porting the code to C++.

John Murphy
 
I didn't know this, Rob. Glad to stand corrected.
Thanks

Roberto

Is this ONLY for CFA drizzle, or all kinds of drizzle? I know if I enable CFA, I need to point to the cosmetic-corrected, un-debayered folder in the format hints section or it won't find the files to drizzle, but with regular drizzle I don't need to tell it this.
 
The hack would only work for mono files. It's also untested...
The C++ version, utilizing .xnml files, will work for both mono and CFA.
 
Got it. For now this seems to give me more benefit than I lose by not drizzling, but more experiments are needed.
 
Hi John,
I think this was interesting, but I wanted to ask you if it is valid...
As I previously stated, I used the NSG script (ver 0.7 or 0.8) and then created the integrated image from the 100 .nsg images.
I repeated using normal integration (with normalization and noise evaluation) with the non-NSG-script images. I could not tell visually which image was better.

I really wanted to know if there was a quantitative difference, as I know some images with high weight (from image integration) have low NWEIGHT and did not look very good. The same reference image was used throughout.

To quantify the differences, I ran the NSG script (now Ver 1.1) using only the 2 integrated images above. I made the NSG-script integrated image the reference image. The output was interesting:

Visually, the NSG script image looked very slightly better in the galaxy than the normally integrated image after running it through the NSG script.

FITS header data of the .nsg-suffix images of the 2 integrated images:
NWEIGHT of the integrated NSG image, now with the .nsg suffix, was 1.000 as expected.
NOISE00: 3.3379e-04
NOISE01: 3.2226e-04
NOISE02: 2.9410e-04

NWEIGHT of the normally integrated image, now with the .nsg suffix, was 1.000669.
NOISE00: 2.9100e-04
NOISE01: 2.9788e-04
NOISE02: 3.2744e-04

The Photometry graph was perfectly linear (all points on the line), with different slopes for R, G, and B.
The Gradient graphs showed the more complex gradient of the normal integration image.

Do you think the above usage of the script is valid?

Perhaps the photometry method you used in your script can be incorporated into Subframe Selector for those imagers and images that do not need the power of gradient matching in the full script. I wonder if the Subframe Selector noise would be removed.

Thanks,
Roger
 
NSG versions prior to v1.1 had an error in the NWEIGHT calculation; the ratio should have been squared. Hence, although the earlier script was correctly scaling the images and correcting the gradient, it was miscalculating the weights. This will have led to a suboptimal result from ImageIntegration. Alas, the analysis you did of NWEIGHT for the earlier versions is now redundant due to this error.

To have meaningful numbers, you first need to reprocess your images with NSG 1.1 and integrate them. Then we can start looking at these new comparisons.
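For anyone wondering why the square matters: for a weighted average, the optimal weight of a frame is inversely proportional to its noise variance, not its noise. A minimal sketch, with the exact definition of NSG's ratio left as an assumption:

```javascript
// Inverse-variance weighting: w ∝ 1/σ², so a noise ratio must be squared.
// How NSG derives the ratio is an assumption; the values are hypothetical.
let refNoise = 2.9e-4, tgtNoise = 3.3e-4;
let ratio = refNoise / tgtNoise;
let weight_v1_0 = ratio;          // pre-1.1: unsquared (the bug)
let weight_v1_1 = ratio * ratio;  // v1.1: squared, correctly down-weights noisy frames
```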
Regards, John Murphy
 
Is a commercial license required to activate the script? Using the trial version, the update appears to download OK, but I get this error message:

"Processing script file: C:/Program Files/PixInsight/src/scripts/JohnMurphy/NormalizeScaleGradient/NormalizeScaleGradient.js
*** Error: *** Error: C:/Program Files/PixInsight/src/scripts/JohnMurphy/NormalizeScaleGradient/NormalizeScaleGradient.js, line 37: include file not found: lib/LeastSquareFit.js"

Files are from an OSC, calibrated, registered, and masters created (dark, bias, lights).
 
This looks like you are missing some files. Perhaps the update failed. The script is attached to this message:
 
also sometimes windows AV programs intercept some subset of the javascript files that comprise an update and break things, so check if you are running AV software and whether it has quarantined that file.
 
Hi John,
Now I realize I was comparing apples and oranges when using the pre-1.1 scripts.
I will rerun with NSG 1.1 and see what I get. I have a little experiment in mind.

Thanks,
Roger
 
Hi John,
All the flexibility in photometry selection and sample generation is amazing, and greatly appreciated.
The contour graphs are amazing too. Your mouse overs are very clear and understandable.

Further studying the Photometry Stars and Sample Generation as I prepare to run Ver 1.1, I have a few questions. See the corresponding numbers in the attached file.

  1. In the Photometry Stars graph, if two stars are selected and their outer boxes overlap each other, should I check whether one or both stars are rejected in the Sample Generation graph, and then manually reject the other?
  2. In the Photometry Stars graph, if one photometry star has another star inside its outer box, should it be manually rejected in the Sample Generation graph?
  3. If there are stars near the edge of the image showing in the Sample Generation graph, but they are not selected as photometry stars, should they be manually rejected?
  4. In most images the border is black on 2 sides due to registration. Does this cause any problems?
In a set of 50 or so registered images, would you expect the same number (maybe more, maybe less) of photometry stars to be found in each image?

If I manually reject stars in the Sample Generation graph, will it affect the stars shown when I go back to the Photometry Stars graph?

Thank you very much.
Roger
 

Attachments

  • Phometry Stars needing rejection.jpg
  • The stars used for stellar photometry (Photometry Stars and Photometry Graph) are completely independent of the stars used by the Sample Generation plot. So increasing or decreasing the number of sample rejection circles (circles around bright stars) in the Sample Generation plot has no effect on the photometry. Rejecting stars from the stellar photometry has no effect on the Sample Generation plot.
  • The photometry outer rectangle measures the background sky flux. It uses the median of the pixel values, which does a good job of excluding stars (see the sketch after this list). Therefore, these stars usually have minimal effect on the measured star flux.
  • It is OK for the photometry inner square to include more than one star, provided that this occurs for both the reference and target images. This will not affect the ratio between the reference and target star(s) flux.
  • Provided that the border around the image is black (zero) the code knows to ignore it. It should be black provided you have not processed images between registration and NSG. If a black border intersects with the photometry inner square, that star will be automatically rejected from the photometry. The photometry squares will no longer be drawn around the star. In Sample Generation, if too much of a square contains black pixels, that sample is automatically rejected.
  • The number of photometry stars is determined by several factors. The star must be detected in both reference and target image. Its peak value must be less than the specified Star peak value. If Limit stars % is less than 100, the faintest stars will not be used. Clearly the imaging conditions will affect which stars will comply with these conditions.
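To illustrate the principle in the second bullet, here is a hypothetical sketch of background-subtracted photometry: sum the pixels in the inner square, then subtract a background level estimated as the median of the outer rectangle. The function and data layout are illustrative assumptions, not NSG's actual code:

```javascript
// Hypothetical sketch of background-subtracted star flux (not NSG's code).
// innerPixels: pixel values inside the inner square around the star.
// outerPixels: pixel values from the surrounding outer rectangle.
function starFlux(innerPixels, outerPixels) {
    // The median is robust: a faint star in the outer rectangle barely moves it.
    let sorted = outerPixels.slice().sort((a, b) => a - b);
    let background = sorted[Math.floor(sorted.length / 2)];
    let sum = 0;
    for (let p of innerPixels)
        sum += p;
    // Remove the background contribution from every inner pixel.
    return sum - background * innerPixels.length;
}
```

Note that if the same extra star falls inside the inner square in both the reference and target images, it inflates both flux measurements equally, so their ratio is unaffected.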
Regards, John Murphy
 
John,
Thank you for your very clear reply and explanations of the photometry boxes. Now I know I do not have to worry about chasing each box in the photometry squares.

The script you developed, and the understanding behind it, is very robust. Thank you.

I am still hoping a simplified, weighting-only version (one that does not produce the .nsg files) can be implemented as part of SubframeSelector, maybe replacing SNR Weight. The graphs in Subframe Selector help visualize data parameters taken on multiple nights.

Roger
 
You can copy the console 'Summary' text to create a CSV text file. Do a global replace of 'NWEIGHT' with 'NWEIGHT,' and ']' with '],', which will allow you to use a comma as the separator when importing into a spreadsheet. You should then be able to graph it.
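The same replacements can be scripted. A minimal sketch, assuming the Summary text has been pasted into a string (the sample line format is illustrative):

```javascript
// Turn NSG console 'Summary' text into CSV via the two global replaces above.
let summary = "NWEIGHT 0.8473 [image_042.xisf]";  // hypothetical sample line
let csv = summary
    .replace(/NWEIGHT/g, "NWEIGHT,")  // comma after the keyword
    .replace(/\]/g, "],");            // comma after each closing bracket
console.log(csv);  // NWEIGHT, 0.8473 [image_042.xisf],
```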

Perhaps a future version of SubframeSelector will allow you to import a value from the FITS header. This could then be set to NWEIGHT.
 
Hi John,
I went back and processed my M101 images (100 x 600 sec at 659 mm, cooled CMOS color camera) using Ver 1.1 of the script. I will break my results down into a couple of messages. My results may not apply to imagers located in suburban or dark sites, although clouds can occur anywhere.

As I said before, my sky background is urban and light polluted, and some images have those pesky light clouds, which may deceive weighting by normal integration's noise evaluation or by Subframe Selector. I think your script is a perfect match for high-median images.
I found:
  1. The Ver 1.1 script ran almost 2x faster than the old Ver 0.8. This is very important when setting the Gradient Correction Smoothness to 0.0 (vs the default of 2). The run time was reduced to 47 minutes on my Dell XPS 15 9570, an apples-to-apples comparison.
  2. It is very useful, and thus important, to study the Gradient Graph and then adjust the smoothness to reasonably follow the data of the reference image. Look at each color one at a time on the graph to really see the gradient in detail. Move the graphed position up and down and left and right.
  3. The calculation of NWEIGHT is very sensitive; it varied from 0.31 to 1.6 against my chosen reference image.
  4. My reference image really did not have a simple gradient, so I expected DBE after integration would not be so easy. It was not easy! I think it is important for users to carefully choose the image with the best gradient, the one that will be easiest for DBE. Don't just pick the image with the best-looking signal.
  5. The script has a slider for ImageIntegration to set the minimum weight percentage to be used in ImageIntegration. The default is 50%. With this setting, 44 of 100 images showed up selected in ImageIntegration. I integrated these 44 images, then the 76 best-NWEIGHT images, and then all 100 images.
  6. After integration, the 76-image stack had a clearly less noisy background, and a bit more visibility in the galaxy arms. Integrating all 100 images weighted by the NWEIGHT keyword was very, very slightly better. It certainly did not hurt the result to include the worst images; their weight contribution was small.
  7. The script generates a huge amount of data. My 3 GB of 100 registered images became 18 GB of .nsg files, ready for further processing. Does a 31 MB image really need to become a 191 MB image?
  8. It is personally satisfying to see the gradient graphs and photometry graphs (3 colors) and to understand what the script is doing and how the different settings will work on the data.
  9. The "zooming" in the Photometry Stars graph and Detected Stars graph is very nice because the zoom is centered on the cursor. I posted a request for PixInsight zooming to use this method. I always seem to get lost when zooming an image in PI.

Further Goals:
  1. One of my additional goals in using the NSG script is to quantitatively evaluate two or more integrated images to determine which is the best to proceed with for further integration. As I have written here before, there is no method to quantitatively measure whether one image is better than another. Different PI book authors have written about good signal to noise, and how different processes can improve signal to noise, but none of the books has a method to quantify it. Thus my interest in using NWEIGHT to judge which image is better.
  2. I also want to confirm that NWEIGHT is more reliable than normal ImageIntegration noise weighting, and whether a different parameter, such as Stars or Median in Subframe Selector, is also good.
  3. I want to confirm that NWEIGHT results in a better-looking image (sorry, no way to measure it) than traditional ImageIntegration.
I will speak to these in a follow-on post. Now I will say it again, John...

Thank you for writing a great script, and making it very functional to use.

Roger
 