Red centers in stars?

macbates

Member
Apr 26, 2019
9
0
I’ve been having a problem lately with many of my processed images having stars with red centers, and although I know when this is happening in the processing chain, I can’t figure out why. For the stars in question, if I examine the Canon raw (CR2) files with the readout cursor, the RGB values are 1, 1, and 1, indicating that the star is saturated, and the stars show as white, which is expected with those RGB values.

After running the images through the WBPP script however, those stars have a bright red center, and the readout cursor now shows (typically) RGB values of 1.0, 0.15, and 0.15. With those values I would expect red to be the dominant color, but I don’t understand why the RGB ratio changed. I did some further looking, and it turns out that the stars “change color” after the debayering process. To investigate further, I processed the files by hand instead of using WBPP (bias and dark integration, flat calibration and integration, etc), and right after the debayering process, the output files had stars with red centers.

This is apparently due to the stars being saturated, so for one last test I fed the same raw files through Deep Sky Stacker. The resulting stacked and registered output file from DSS showed the stars as white, with no red centers. Loading the DSS output file into PI, I checked the same star with the readout tool, and got RGB readings of 0.706, 0.865, and 0.782. A bit on the greenish blue side, but that might be explained due to the Astronomik CLS filter in the camera.

So, from the above (condensed from many hours of experimenting) it appears that the Debayer process in PI is turning any and all saturated stars into ones with red centers. More accurately, all pixels with an RGB value of 1, 1, 1 are turned into pixels where red is the dominant color.

I’m stumped as to where to go from here and/or how to fix this problem. As far as I can tell I have all PI settings correct, including using pure raw. Any ideas as to what the problem might be or what I should try next?

Thanks in advance for any ideas, suggestions, or help,

- Ken
 

pfile

PTeam Member
Nov 23, 2009
5,099
41
as far as i understand this, it is due to the 14-bit ADC in canon cameras. when a star is saturated in a given channel, the saturated pixels have the value 16383 (the 14-bit maximum). when 14-bit files are opened by libRAW under "pure raw" they are converted to 16-bit integers, and the values are carried straight over to the 16-bit values with no "expansion". so now what was saturated ends up looking like 0.25 in pixinsight - 16383/65535 (65535 would be a saturated pixel in a 16-bit integer file.)

probably DSS is expanding the 14-bit values to the 16-bit integer space, essentially shifting the data left by 2 places.
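here's a toy sketch of the two conversions (not libRAW's or DSS's actual code, just the arithmetic as i understand it):

```python
# Toy sketch of the two ways a saturated 14-bit sample can land in 16-bit space.
sat_14bit = 2**14 - 1              # 16383, a saturated 14-bit pixel

# "pure raw" carry-over: the value is copied into 16-bit space
# unchanged, then normalized by the 16-bit maximum.
carried = sat_14bit / 65535        # ~0.25 - no longer looks saturated

# expansion (presumably what DSS does): shift left by 2 bits so the
# 14-bit range fills the 16-bit range.
expanded = (sat_14bit << 2) / 65535

print(round(carried, 2), round(expanded, 2))   # 0.25 1.0
```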

i think you can force libRAW to do this by first clicking "pure raw" in the RAW format explorer and then unticking "no highlights clipping". if you read the tooltip for that setting you'll see that it mentions clipped pixels taking on various shades of pink.

there is probably a format hint for this but i'd have to search for it. if you are using the "raw cfa" format hint that would probably override the checkbox, so if you are testing this make sure to remove the format hint first.

rob
 

bulrichl

Well-known member
Nov 2, 2016
692
34
La Palma, Canary Islands
Canon raw files (in CR2 format) contain CFA data, so they don't have RGB values. If I examine a Canon raw file (CR2) from my EOS 600D (= Rebel T3i; 14-bit ADC), the intensity value in the core of saturated stars is 0.2335 in the normalized real range. This corresponds to 15305 ADU in unsigned 16-bit integer format and is the saturation intensity. I am not aware that Canon has released a camera with a 16-bit ADC, so I cannot imagine that values >= 0.25 are possible in the normalized real range. If I debayer a Canon raw file, the RGB data in the core of a saturated star are the same (0.2335 and 15305 ADU, respectively).

When the raw data of an OSC camera are calibrated, the ratios of the color channels are indeed altered. This is due to the flat frame calibration, in which the dark-calibrated data are divided by the MasterFlat and multiplied by the mean of the MasterFlat. "Mean of the MasterFlat" means that the average is computed over all channels together. So a channel that is weakly represented in the MasterFlat will be intensified, and a channel that is strongly represented in the MasterFlat will be attenuated in the flat-fielded data. This is completely correct, because the data are not yet color calibrated at this stage. Without a reasonable color calibration, the core of a saturated star will not appear white.
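A small sketch of this effect with hypothetical per-channel flat values (not WBPP's actual code; I chose a flat in which red is weakly represented, so red gets intensified):

```python
import numpy as np

# Hypothetical numbers, just to illustrate the principle.
pixel     = np.array([0.25, 0.25, 0.25])  # saturated core after the 14- to 16-bit carry-over
flat      = np.array([0.50, 0.85, 0.70])  # per-channel response of the MasterFlat (R, G, B)
flat_mean = flat.mean()                   # mean computed over all channels together

# Flat fielding: divide by the MasterFlat, multiply by its mean.
calibrated = pixel / flat * flat_mean
print(calibrated)   # red is now the dominant channel; the ratios are no longer 1:1:1
```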

The color calibration has to be performed after integration of the calibrated, optionally cosmetically corrected, debayered and registered data. After a correctly applied color calibration, the core of a saturated star will be white.

Bernd
 

pfile

PTeam Member
Nov 23, 2009
5,099
41
there's got to be a reason for the pink color though. maybe it is because the G channel's SNR is 2x the other channels in the flat. but that doesn't explain why we never see blue stars when this happens; the cores are always pink.

i'm not convinced colorcalibration will take care of this problem because if the red channel is saturated and the others aren't, when the image is brought into the 16-bit space none of the channels seem to be saturated in that space. therefore it seems like the legitimate color of the pixel is pink when instead it should be bright red. ordinarily color calibration would exclude saturated pixels, but these saturated pixels don't appear to be saturated. if all 3 channels are saturated it should be OK, but if only one channel is saturated then you get problems.
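to illustrate, a toy sketch of why the check misses these pixels (the threshold is hypothetical, not what ColorCalibration actually uses):

```python
# hypothetical cutoff a color calibration routine might use to reject
# saturated pixels.
SATURATION_THRESHOLD = 0.98

# a pixel saturated at the 14-bit maximum, carried straight into 16-bit space.
red = 16383 / 65535   # ~0.25

print(red >= SATURATION_THRESHOLD)   # False - the pixel is treated as valid data
```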
 

bulrichl

Well-known member
Nov 2, 2016
692
34
La Palma, Canary Islands
I cannot confirm what you are saying.

I dug out an old integration (December 2017). A modified Canon 600D (= Rebel T3i) was used, 118 x 300 s = 590 min at ISO 800; target: NGC 2023 and surroundings.

This is after ImageCalibration (MasterDark, MasterBias, MasterFlat; dark frame optimization was applied), CosmeticCorrection, Debayer, StarAlignment and ImageIntegration. The edges are cropped in order to remove dithering edge artefacts. Finally I performed PCC. No further processing was applied. This is the result (linear image, STF with 'Link RGB channels' enabled): no pink cores of stars.

Bernd
 

Attachments

pfile

PTeam Member
Nov 23, 2009
5,099
41
i had this problem with wide areas of pink signal multiple times while processing the 2017 eclipse image. obviously in the brackets there are very many parts of the image that are overexposed. i can't say anything more except juan felt the need to mention pink star cores in the tooltip for the "no highlights clipping" control of libRAW.
 

bulrichl

Well-known member
Nov 2, 2016
692
34
La Palma, Canary Islands
I am using PixInsight also for daylight images with the DSLR camera. For those I have the RAW preferences set to 'Demosaiced RGB' and white balance to 'Camera white balance'. Pink areas result only in saturated regions when 'No highlights clipping' is enabled, but when you click on 'Demosaiced RGB', all 'Output options' are disabled, 'No highlights clipping' as well. These settings work for daylight images - no problem.

I don't remember ever having this issue with astro images so far. My settings for astro images are 'Pure raw', and that means 'No highlights clipping' is enabled.

Bernd
 

macbates

Member
Apr 26, 2019
9
0
I have tried both, and am still in the process of figuring things out.

bulrichl: You’re correct; the raw files do not have RGB information (I should have known that!). I don’t know what I must have been looking at, but it certainly wasn’t a raw file. I did check again using PI, and the star in the raw file had a luminance value of 0.2335, just as you predicted.

pfile: I disabled the “no highlights clipping” in the raw preferences, and also removed the “raw cfa” hint in the debayer process, but the red centers of some of the stars remain in the single, linear image coming out of WBPP (just prior to final integration). Interestingly, if I do an auto-stretch to this image, that same star looks white, even though the RGB value is 0.99, 0.15, and 0.15.

Continuing through the post-processing steps, the red star centers appear when I do a masked stretch (which is what I have been using lately). If, on the other hand, I use the histogram stretch process, the stars are white as they should be.

A quick summary of what I’ve seen referencing one specific star among many:

Raw CR2 file Quick Look (Mac) or open in Preview (Mac): Star is white
Linear image after debayering in PI: Center of star is red
Linear image with auto stretch: Star center is white
Image after histogram stretch: Star center is white
Image after masked stretch: Star center is red
Tiff file stacked in DeepSky Stacker (linear): Stars are all white, whether auto stretched or not.

So, although the problem (or reason behind the problem) is not solved, at least I can bypass it by using the histogram stretch instead of a masked stretch when converting from linear to non-linear. Alternatively, I can skip WBPP and use the output of DeepSky Stacker as input to PI post processing. That’s fine for now since I’ve got lots of previous work to redo, but at some point I need to spend some time with the masked stretch process and see what I’m doing wrong.

Thanks to both of you for all the advice and tips. Still a ways to go, but I'm gradually getting there.

- Ken
 

pfile

PTeam Member
Nov 23, 2009
5,099
41
so it looks like the no highlight clipping did the expected thing with respect to the pixel values, but it had no effect. at this point i wonder if the stars that are saturated in that manner are actually red stars - if so then kinda no matter what you do you have lost information about those stars. one thing you can do is to get some shorter exposures where the red stars are not saturated (again assuming they are red stars) and use HDRComposition to replace the saturated data with unsaturated data. in this case though i am almost sure that it wont work right unless you enable no highlight clipping since the 0.25 saturated values will likely not be seen as bad data to be replaced when HDRComposition runs.

i think the fact that you are using masked stretch is what is revealing this problem. bernd may not have been using that type of stretch. in my case i was using all kinds of different stretching methods when working with the eclipse image i was talking about which might be why i saw the problem so readily.

does DSS do any kind of post-stacking normalization of images? like are you using the align RGB channels setting?
 

macbates

Member
Apr 26, 2019
9
0
I’m not sure what all DSS does; I just tried it to get another data point. It did have fairly bad banding, but the Canon Debanding script cleaned that up nicely. BTW, the stars aren’t all red: they’re mainly white, but with an annoying red spot right in the center. A lot of earlier images I’ve processed have the same problem - a few stars with a red spot right in the center.

I’m sure that it’s something in my workflow (possibly including data capture) that is causing this, so what I have to do is to keep trying different things and analyzing the results to try and figure out exactly what is happening and why. Once I figure that out, the solution should be obvious (I hope).

All this processing is really making me want a new Mac Pro with lots of memory and backing SSD storage to speed things up. On the other hand, given the current situation I have plenty of free time...

- Ken
 

bulrichl

Well-known member
Nov 2, 2016
692
34
La Palma, Canary Islands
@Rob:
You are right, I did not apply the MaskedStretch process.

@Ken:
When a pixel is saturated in a color channel, its intensity in the stretched image should be near 1.0 in the normalized real range. The problem you are confronted with is that saturation occurs at different intensities for the three color channels when a correct color calibration with PCC is performed.

I guess the only way to get white cores in saturated stars is truncation. This should not be done in the linear stage, but probably as the final step of the whole processing. Assume that in the color-calibrated and stretched image, saturation occurs at intensity r for the red, g for the green and b for the blue channel. Apply the following PixelMath expression:

$T/min

where min is the minimum of r, g and b. 'Rescale result' has to be unchecked in order to truncate results > 1.0.
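The same operation expressed in Python (numpy) with hypothetical values for r, g and b:

```python
import numpy as np

# Hypothetical per-channel saturation intensities measured in the
# stretched, color-calibrated image.
r, g, b = 0.98, 0.82, 0.87
m = min(r, g, b)

image = np.array([[0.98, 0.82, 0.87],    # saturated star core
                  [0.40, 0.30, 0.35]])   # ordinary pixel

# Equivalent of the PixelMath expression $T/min with 'Rescale result'
# unchecked: divide by the smallest saturation level, truncate above 1.0.
truncated = np.clip(image / m, 0.0, 1.0)
print(truncated)   # the star core becomes 1, 1, 1 (white); the ordinary pixel is only rescaled
```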

Bernd
 

macbates

Member
Apr 26, 2019
9
0
Thanks for the PixelMath tip. I used it (after stretching), and it did mute things quite a bit; instead of red centers I now have smaller pink centers. I think the real problem is twofold:

1: The red channel of many of the stars in the integrated (WBPP output) images is saturated, and I really don’t understand why that is. I’ve checked images I’ve taken on other nights and of other objects, and the same issue appears in all of them; many of the stars are saturated in the red channel, resulting in red star centers after processing.

2: The MaskedStretch process yields a very nice result, but those problematic stars have a red spot in the center. The AutoSTF and HistogramStretch processes do not result in red star centers, but the background is quite a bit noisier. I could probably eliminate that by playing around with the HistogramStretch function after using the AutoSTF output. No idea why HistogramStretch works better in this case, but it does allow me to get on with the processing.

My thought is that since many of the pixels in the raw (CR2) image are saturated, my exposure time is too long, and I may be able to solve the problem by using shorter exposures. As for the red channel being saturated, perhaps the sky/background noise is dominant in the red channel, so longer exposures mean that this is accentuated for any star that happens to be overexposed.

When and if I get a clear and moonless night I’ll take a series of exposure groups from 30 seconds or so up to 5 minutes, and then see how the problem stars look. Just because I can take a 5 or 10 minute exposure doesn’t mean I necessarily should, I guess.

Thanks again for all the suggestions; now that I know where the problem lies it’s time to see if I can gather better data to work with.