BlurExterminator solarizes view

GaryP

Well-known member
I have another 10 days on my free trial of BlurExterminator, so time is running out to find out how well it works. Six JWST files of Stephan’s Quintet were combined in PixelMath in the usual way, grouped by wavelength into three pairs with R = (f444 + f356), G = (f277 + f200)/2, and B = (f150 + f090)/2. The result is the view labeled “Image15” in the screenshot. BlurExterminator was then applied with the default values; the result is the second view from the right, which appears hopelessly solarized. No STF is applied to it, but the usual STF only aggravates the problem, and I was unable to make any constructive changes to the view with STF or HT. I am wondering why this happened and how to do better.
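For clarity, the channel combination described above can be sketched in numpy (random arrays stand in for the actual filter images, which are not available here). Note that, as written in the post, R is a straight sum while G and B are averages, so R can run out of the [0, 1] range before any rescaling:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-ins for the six linear JWST filter images, in [0, 1].
f090, f150, f200, f277, f356, f444 = [rng.random((4, 4)) for _ in range(6)]

# The combination as written in the post (note R is a sum, not an average):
R = f444 + f356            # can exceed 1.0
G = (f277 + f200) / 2
B = (f150 + f090) / 2

rgb = np.stack([R, G, B], axis=-1)
print(rgb.shape)
```

In PixelMath the "rescale" option would then map any out-of-range values back into [0, 1].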

The RGB file prior to the application of BlurExterminator is here: Image15
Screen_Shot.png
 
something is strange about this data (Image15). the green channel has a background of 0.000 and the other two channels have fixed background values.

something's gotten clipped/clamped somewhere. did you linearfit the filter images to the partner that it's being summed with? did you tick "rescale" in pixelmath?
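For reference, a minimal sketch of what a rescale step does (a rough analogue of the PixelMath checkbox; the actual PixInsight implementation may differ in details): out-of-range values are mapped back into [0, 1] linearly, rather than clipped or clamped.

```python
import numpy as np

def rescale(img):
    """Map the image's [min, max] linearly onto [0, 1]."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

# An out-of-bounds result, e.g. from summing two channels:
x = np.array([-0.2, 0.0, 0.5, 1.3])
print(rescale(x))  # endpoints land exactly on 0 and 1; nothing is clipped
```

This is why ticked rescale should have prevented clamping: every value is remapped, and none are forced to a fixed background level.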

rob
 
I did tick "rescale" in PixelMath. I did not linear fit the images. I'm not sure what you mean by "the partner that it's being summed with": do you mean they should all be fit to one reference image, or that each pair should be linear fit to each other? Two of the pairs have disparate backgrounds, so I will try fitting those. I was under the impression that color calibration would do this, and that was to be my next step.
 
i was trying to say that you really only have to linear fit each pair of images (f444 and f356, f277 and f200, etc.) since you'll be trying to add them together.

if the pixel values in f444 wildly differ from f356 for whatever reason, the sum or average is going to be weighted toward one of them.
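A linear fit of this kind can be sketched in numpy (a rough analogue of PixInsight's LinearFit tool, not its actual algorithm): fit target = a·reference + b by least squares, then invert the fit so the target matches the reference's scale and offset before summing.

```python
import numpy as np

def linear_fit(reference, target):
    """Fit target = a*reference + b, then remap target onto the
    reference's scale and offset (rough LinearFit analogue)."""
    a, b = np.polyfit(reference.ravel(), target.ravel(), 1)
    return (target - b) / a

rng = np.random.default_rng(1)
f356 = rng.random((8, 8))
f444 = 2.5 * f356 + 0.1          # same signal, different scale and offset
f444_fit = linear_fit(f356, f444)
print(np.allclose(f444_fit, f356))  # True -- the pair now averages fairly
```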

it might be worth checking the individual filter images to see if they are messed up somehow, since having ticked rescale should have taken care of any out-of-bounds values. it doesn't make sense to me that the backgrounds of all 3 images are clamped to the same exact value everywhere.

yes - color calibration will fix whatever offsets are between the resultant R/G/B channels in the combined image, but it can't do anything about the properties of the images that were combined to create each channel.
 
Any suggestions regarding what I should look for and how to check it would be helpful.
 
it's not normal for any image from a digital sensor to have the same background value everywhere, so check those filter images to see if they already look like that. i had been assuming that the clamping came from the pixelmath operation, but maybe not.
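One quick way to check the filter images for this (a numpy sketch, not a PixInsight tool; the statistics could equally be read from PixInsight's Statistics process): in a healthy linear image the minimum value should occur at essentially one pixel, while a clamped image has a large fraction of pixels pinned to the same value.

```python
import numpy as np

def fraction_at_min(img):
    """Fraction of pixels sitting exactly at the image minimum --
    a large value suggests clipping/clamping."""
    return float(np.mean(img == img.min()))

rng = np.random.default_rng(2)
# Hypothetical examples: a healthy sky background vs. one clipped at zero.
healthy = rng.normal(0.05, 0.01, (16, 16)).clip(0, 1)
clamped = rng.normal(-0.02, 0.01, (16, 16)).clip(0, 1)  # mostly pinned to 0
print(fraction_at_min(healthy))  # tiny
print(fraction_at_min(clamped))  # large
```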

where did those filter images come from? are they integrations of calibrated JWST subexposures? or just subexposures? were they calibrated by you or by STScI?
 
I'm afraid I don't know how much integration and calibration the JWST images undergo before they reach the public; those are two of the many things I have yet to learn. Under STF, they appear to me to have significant variation in background. Two of the original six had what I would call a flat background under STF, as opposed to black, but they were still linear when combined, if I remember correctly. After experimenting for a few hours it is hard to keep track, and starting over means significant time reacquiring the files from MAST; I have already started over at least twice. I am now starting over from what I think are still linear images to see if I can get a better result.
 
Google "jwst processing pipeline" and you'll find the descriptions of the primary processing (calibration and stacking) and various post processing tools. It's well documented.
 
I think I did this once, but at the time I understood little of it. I will take another look. In fact, one of the filter images I was trying to use on this project had a lot of vertical and horizontal striping; removing it was beyond me, so I just used five filters, and I'm reasonably satisfied with the results for now. I attach a small cutout of a galaxy that is quite tiny in the main image. It appears a couple of centimeters from the left edge, near the horizontal center of the image above; you can get an idea of its size by comparing it to the distant red galaxies. BlurExterminator sharpened it up a bit.

deep_field_galaxy.jpg
 