Local Normalization and Airplanes/Satellites rejection not working well

Linwood

Well-known member
I've been accumulating data, and after each night I run a draft integration with WBPP. I've noticed the result still has hints of airplane trails in it, despite an ever-growing number of subs.

I just did a test and integrated by hand, first using LN (and got the same result), then turning off local normalization in favor of additive with scaling and scale + zero offset, with everything else the same, and the trail disappeared in my sample (if you really squint you may see some vague hint of it, but it is largely gone). I was using ESD for rejection with defaults, but again, nothing changed other than LN being turned off. There are 199 subs, so it's not a small sample, and I blinked carefully to ensure there was only one sub with a trail in this location. It's not the only case (and if I recall, one of the others has 289 subs).
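For readers unfamiliar with it, the generalized ESD (extreme Studentized deviate) rejection mentioned above works roughly like the sketch below. This is a simplified illustration of Rosner's test on a 1-D pixel stack, not PixInsight's actual implementation; the `esd_outliers` helper and its parameters are hypothetical.

```python
import numpy as np
from scipy import stats

def esd_outliers(values, max_outliers=3, alpha=0.05):
    """Generalized ESD (Rosner) test: return the indices of up to
    max_outliers outliers in a 1-D sample."""
    x = np.asarray(values, dtype=float)
    idx = np.arange(x.size)
    candidates = []  # (original index, statistic R_i, critical value lambda_i)
    for _ in range(max_outliers):
        n = x.size
        if n < 3:
            break
        mean, sd = x.mean(), x.std(ddof=1)
        if sd == 0:
            break
        dev = np.abs(x - mean)
        j = int(dev.argmax())
        r = dev[j] / sd
        p = 1.0 - alpha / (2.0 * n)
        t = stats.t.ppf(p, n - 2)
        lam = (n - 1) * t / np.sqrt((n - 2 + t * t) * n)
        candidates.append((int(idx[j]), r, lam))
        x = np.delete(x, j)
        idx = np.delete(idx, j)
    # The number of outliers is the largest i for which R_i > lambda_i.
    k = 0
    for i, (_, r, lam) in enumerate(candidates, start=1):
        if r > lam:
            k = i
    return {candidates[i][0] for i in range(k)}

# A stack of 20 pixel values with one bright trail pixel:
stack = [10.0] * 19 + [100.0]
print(esd_outliers(stack))  # the trail pixel's index is flagged
```

The point relevant to this thread: ESD operates on the normalized pixel stack, so if normalization itself distorts the trail pixels relative to the rest of the stack, the statistic may no longer exceed the critical value.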

Here is an example at 1:2, hopefully it will preserve the size of the attachment so you can see it.

Should LN cause rejection to work more poorly? It's a good-sized, heavy trail, but not exactly what I would expect to be a problem, especially when only 1 of 199 subs has it.

Linwood

PS. This was with PI's LN files, nothing to do with NSG-generated ones.

LN_Trails.jpg
 
This should not happen. We have made many tests with sets including airplane and satellite trails much heavier and larger than this, and LN works perfectly. So something very strange is happening here and we must discover it.

I need this data set, or at least a subset where this can be reproduced. Ideally, the entire set of 199 calibrated (unregistered) frames. Can you please upload it?
 
At any rate, I have an important improvement already designed for the LocalNormalization process, which will solve all of these rejection-related issues once and for all. For the implementation of this improvement, this data set would be of great help.

A general reflection: we always need problematic data sets, where the problems happen and can be reproduced, to improve our algorithms and their implementations. Lacking the mental brilliance required to solve problems just by looking at them (inspiration, illumination, etc.), we unfortunately have to rely on hard work, which requires data for finding problems and testing solutions.
 
Adam, yes, when I ran it manually I could see it. I'll set everything up this morning with fresh eyes and run again.

This time I'll save off the icon before integration and share, maybe you will see something odd in the integration settings.

The reason I pursued it is that for months I've been simply ignoring satellite trails, as they magically vanished during integration, but in the last few runs not so much. I was not really worried, as it has taken maybe 45 days of cloudy weather to get the data I have so far, so I was not finalizing anything. I cannot say it started with 1.8.9, but that feels like about the right time.

Juan, this is a full-frame 61 MP camera, so the files are huge. The registered files alone are 45 GB. I can probably find a place to upload that, but it will take about a day (slow internet). However, before I do...

What files do you want, i.e. how far back in the processing chain is useful? Do you want the original subs and flats with master darks? Master flats? Cosmeticized with an alignment master (I used a generated one)? Or registered? Or try to cut it down to 30 or so subs and reproduce there?

I do want to carefully run it again this morning. I have tested NSG a lot a few weeks back and just found out it does (to me) a really bad thing -- it overwrites the process icon you give it for integration, and I think resets it. So my "normal" integration icons are all hosed from when I was testing it, and I need to recreate them. I did not realize it did that until last night, when I was exploring these options, found the wrong settings, and went digging. So let me walk through the whole integration more carefully as well. But... that would not have affected WBPP.
 
I have tested NSG a lot a few weeks back and just found out it does (to me) a really bad thing -- it overwrites the process icon you give it for integration, and I think resets it.
I do not think this is correct. In my testing, NSG never did anything to the ImageIntegration process icon I gave it.
 
I do not think this is correct. In my testing, NSG never did anything to the ImageIntegration process icon I gave it.

Just for clarity: this issue (or non-issue, if I'm wrong) confused me, but it has nothing to do with my current concern, other than that it may have screwed up my icons.

As a test I just ran a few images through NSG without auto-run, and it said this at the end, certainly implying it replaced my icon settings.

I'm resetting all the icons, and then I'm going to run WBPP from the beginning, use those files for a manual integration with LN, and then run again with the same integration process without LN checked. It will take a while, but it should confirm whether the trails were my mistake or an artifact coming out of LN (or, more precisely, of trails not being rejected when LN is used). Give me a few hours. If it's worth discussing the icons, maybe we should move that to another thread (or to that site).

nsg.jpg
 
What files do you want, i.e. how far back in the processing chain is useful? Do you want the original subs and flats with master darks? Master flats? Cosmeticized with an alignment master (I used a generated one)? Or registered? Or try to cut it down to 30 or so subs and reproduce there?

I need the calibrated subframes, without cosmetic correction, i.e. just after ImageCalibration. Ideally I would need the entire set of 199 images you mentioned, but if you manage to find a smaller subset where the same problem can be reproduced consistently under the conditions you reported, that can also work. Thank you in advance for your help!
 
If it's worth discussing the icons, maybe we should move that to another thread (or to that site).

Much better on a site dedicated to the aforementioned third-party (and competing) product.
 
Thanks, Juan. Let me first redo all of this from the very beginning and make sure I did not make some stupid mistake. Then I'll try integrating just 25 or so frames, see if the same thing happens, and go from there. I've just deleted all intermediate files and am starting WBPP now...
 
I did reproduce it from scratch: WBPP has the trail; a manual integration using WBPP's LN files has the trail; a manual integration with the same process, just with the LN files removed (which resets normalization to additive with scaling and scale + zero offset), almost entirely removes it.

What's interesting is that there are a LOT of subs with satellite trails.

I tried taking 25 subs, including the one with the trail, and integrated with and without LN; I saw little if any difference (i.e., the trail was in both). So the entire set is probably needed.

I packaged the alignment master (in case you want it), the calibrated but not cosmetically corrected images, and the icon for cosmetic correction (just in case) into a zip file, along with three post-integration images: one from WBPP, one from a manual integration using WBPP's LN files, and one without the LN files. I applied zlib compression to the files before zipping, as I think it may do better than zip alone. These will likely take most of a day to upload, so I'll post a link tomorrow.
 
I have been looking at this more carefully while watching my slow internet, and one thing I noticed is that if I blink through the registered images from the night this image was taken (2022-03-26_23-11-19 is its time stamp, by the way), there is a weird blotchy pattern that moves as you blink. Other days do not have it. The calibrated and cosmetically corrected versions do not have it either; only the registered subs (done by WBPP), and only that night. It is subtle: looking at a single image I do not really see it, but blinking shows the pattern because it shifts.

I have no idea whether this has anything to do with the issue, but that night contains the only trail that remains, or at least the only one I noticed in this filter (there are similar trails in others). I know of nothing different about that night: same temperature, same exposure, same gain, same dither (every 2). Might be nothing; just a FWIW.

Google says I have another 7 hours of uploading to go. Crashing for the night; I will post a link in the morning if it is complete.
 
Thank you *so much* for uploading this data. I know how hard this can be with a slow connection; I also suffered that, not too many years ago. I am going to download it right now, and I'll let you know when I have it.
 
I still suspect I did something incorrectly, but hopefully finding my mistake will help others as I can't see it.

Not necessarily. It could be that this particular plane trail is not being rejected well in this particular frame, generating a large-scale artifact in the local background model used to generate the local normalization functions. If this happens, the trail cannot be rejected well, because the local background contains (part of) it.

The new version of the LocalNormalization tool comes with a new rejection algorithm that should solve these problems completely, but for its implementation I need data sets exactly like this one, so this is going to be of great help to me.
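The failure mode described above can be illustrated with a toy example: if a bright trail leaks into the large-scale background model, normalization partially absorbs it, and rejection can no longer see it as an outlier. Below is a minimal sketch contrasting a naive background model with one that clips bright pixels first; the smoothing scale and the 5-sigma clipping threshold are arbitrary illustrative choices, not LN's actual parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(0)
frame = rng.normal(100.0, 5.0, (64, 64))  # flat sky with Gaussian noise
frame[30:34, :] += 400.0                  # one bright horizontal "trail"

# Naive large-scale background model: heavy smoothing of the raw frame.
# Part of the trail leaks into the model.
naive_bg = uniform_filter(frame, size=31)

# Robust model: clip pixels far above the median before smoothing,
# so the trail cannot contaminate the local background.
med = np.median(frame)
mad = np.median(np.abs(frame - med))
threshold = med + 5 * 1.4826 * mad        # ~5 sigma, an arbitrary choice
clipped = np.where(frame > threshold, med, frame)
robust_bg = uniform_filter(clipped, size=31)

row = 32  # a row crossing the trail
print(naive_bg[row].max() - med)   # large bump: the model absorbed the trail
print(robust_bg[row].max() - med)  # near zero: the model stayed clean
```

When the contaminated model is divided out or subtracted during normalization, the trail's pixel values are pulled toward the stack's consensus, which is exactly why a rejection algorithm downstream can fail to flag them.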

I still haven't been able to download the file, since it seems a lot of people are already downloading it and the server is over quota. It seems this problem has raised a lot of expectation/interest :)
 
Wow, that's Google and an enterprise account; I'm surprised there is such a concept as over-quota.

I should note I did not try large-scale structure rejection, and I suspect that might have behaved differently.

I should also note that other filters had these artifacts as well, though Green was the one with the fewest subs, so I used it. I had almost 300 subs for Red, which had a very prominent trail remaining. If you get something you want tested, and it is practical to do, I would be happy to run it. I need more time on this target in general (I'm trying to get the tidal tail) and no clear nights are expected soon, so I have time. :(

Should you have the opportunity, please blink through the subs after registering, zoomed all the way out, and as you go through the night of 3/26 see if you notice anything unusual in the background. If so, I'd love to know what I might have done to cause it, and whether it is related to the other issue.
 
Thank you. I have downloaded the file, so you can remove it when you want. I'll come back when I have some results.
 
Hi Linwood,

I have completed a first series of tests, and have implemented one of the important improvements in the LocalNormalization tool, which will be released with version 1.8.9-1 of PixInsight. Here is the result with your data set:

Desktop1.jpg


Normalization and rejection are now perfect. I have used large-scale rejection as you can see in the above screenshot. I'll contact you privately with a link to the integrated XISF image.

The first problem with this data set is that it contains a lot of bad frames that must be excluded from the integration. The first step is using SubframeSelector to analyze the calibrated data:

Desktop2.jpg


As you can see, only a subset of 56 frames in this set is worth using. The rest of the frames must be excluded, because they can only damage the result. To give you an idea of the reasons for this decision, see the Noise property represented in the Measurements window above. All of the rejected frames have median noise values at least 14 times higher than the good subset (6 frames), and the vast majority (137 frames) at least 26 times higher. In version 2.4.2 of the WeightedBatchPreprocessing script we are going to implement a new feature for automatic rejection of bad frames.
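As a rough illustration of this kind of noise-based frame selection, the sketch below keeps only frames whose noise estimate stays close to that of the cleanest frames. The `select_by_noise` helper, its decile baseline, and the factor of 3 are hypothetical illustrative choices, not WBPP 2.4.2's actual criterion.

```python
import numpy as np

def select_by_noise(noise, factor=3.0):
    """Keep frames whose noise estimate is within `factor` times the
    median noise of the best decile of frames; reject the rest.
    The decile baseline and the factor are illustrative choices only."""
    noise = np.asarray(noise, dtype=float)
    n_best = max(3, noise.size // 10)
    baseline = np.median(np.sort(noise)[:n_best])
    return noise <= factor * baseline

# Simulate 56 good frames and 143 far noisier ones (the counts reported above):
rng = np.random.default_rng(1)
noise = np.concatenate([rng.normal(1.0, 0.05, 56),    # good frames
                        rng.normal(20.0, 2.0, 143)])  # much noisier frames
keep = select_by_noise(noise)
print(int(keep.sum()))  # 56: only the good frames survive
```

With noise gaps as large as the 14x to 26x reported here, almost any reasonable threshold separates the two populations cleanly.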

Once we have selected the good subset and registered it, we can apply LocalNormalization with default parameters. In version 1.8.9-1 of PixInsight, LN has a new parameter that controls rejection of saturated pixels, namely High clipping level:

LN.png


This parameter and its associated feature, which work perfectly with the default value of 0.85, prevent the generation of dark artifacts around large saturated structures, such as the bright star in your image. With this change, and another improvement planned for the rejection routine, LN is now a completely robust and extremely accurate normalization process in version 1.8.9-1 of PixInsight.
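The idea behind such a high clipping level can be sketched as follows: pixels above a relative threshold are simply excluded from further statistics, so a saturated star core cannot bias the surrounding background estimate (which is what produces dark halos). This toy example uses a masked mean as a stand-in for LN's actual background modeling; the `mask_high` helper is hypothetical.

```python
import numpy as np

def mask_high(frame, high_clip=0.85):
    """Exclude pixels at or above high_clip (in the normalized [0,1] range)
    from further statistics, so saturated structures cannot bias a local
    background estimate. Illustrative only; LN's internals differ."""
    return np.ma.masked_greater_equal(frame, high_clip)

frame = np.full((8, 8), 0.1)   # dim sky background
frame[3:5, 3:5] = 1.0          # saturated star core
plain_mean = float(frame.mean())
masked_mean = float(mask_high(frame).mean())
print(plain_mean, masked_mean)  # the saturated core inflates the plain mean
```

Here the masked estimate recovers the true sky level, while the plain mean is pulled upward by the saturated core; a model built from the plain statistics would overshoot near the star and leave a dark ring after normalization.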

Thank you for uploading this data set. It is proving very useful for the development of important improvements in some of our most critical tools.
 
Hi Juan,

Glad to hear about the advances in the LN tool. I will be curious to see whether this improves the satellite problem in my Witch Head data set.

John
 