I shoot astrophotos with a 14-bit DSLR, so my raw data are constrained to the range 0 - 0.25 in PixInsight's [0, 1] data space (16383/65535 ≈ 0.25). I set exposure and ISO to bring the histogram peaks up about 10% from the left edge, so the photon noise is well separated from the camera's read noise.
When I do this, there are usually 20 or so bright stars in the image whose cores are saturated. Right after debayering I see magenta cores with R, G, or B pixel values well above 0.25. So saturation and wonky data are happening way early in the process.
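To illustrate what I mean, counting the pixels that debayer interpolation has pushed above the 14-bit ceiling is straightforward (a NumPy sketch, not a PI tool; the function name is my own):

```python
import numpy as np

# 14-bit data stored in a 16-bit container tops out at 16383/65535.
CEIL = 16383 / 65535

def count_over_ceiling(img):
    """img: H x W x 3 float array in [0, 1].
    Returns the number of pixels where any channel exceeds
    the 14-bit ceiling -- i.e. values debayering invented."""
    return int(np.count_nonzero(np.any(img > CEIL, axis=-1)))
```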
Because debayering pushes the saturated cores artificially well above 0.25, they compress the dynamic range available during stretching. I want to tame those cores while the data are still linear, so the stretch can work with its optimum dynamic range.
I've tried the Repaired HSV Separation script, and it helps, but not much: it still leaves saturated cores filled with artificial data. It also breaks the image history, since it creates a new image rather than updating the original.
I've also done HDRComposition using subs spanning a wide range of exposures, from 100 s down to 3 s. This works better, although some cores are still saturated even at 3 s, and it's a big pain. The results are nice, though: stars that show their color almost all the way to the center of the core.
What I want is to merge the linear pixel data from just outside each saturated core inward to fill the core. That would give just about the same result as HDRComposition, but with far less effort, and it would also work on images where no short-exposure HDR subs were taken. I can't find any PI processes or scripts that do this, though, and can't think of a way to coax it out of the ones I have found.
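In case it helps, here is roughly what I have in mind, sketched in Python/NumPy (a hypothetical illustration, not an existing PI process: label each saturated core, then fill it with the per-channel mean of the unsaturated ring just outside it):

```python
import numpy as np
from scipy import ndimage

def fill_saturated_cores(img, sat=0.24):
    """img: H x W x 3 linear float array in [0, 1].
    sat: threshold slightly below the 14-bit ceiling (~0.25),
    so near-saturated pixels are caught too.  Replaces each
    saturated core with the mean of the unsaturated ring
    two pixels wide just outside it (illustrative sketch)."""
    out = img.copy()
    # a pixel is "saturated" if any channel exceeds the threshold
    core = np.any(img >= sat, axis=-1)
    labels, n = ndimage.label(core)
    for i in range(1, n + 1):
        mask = labels == i
        # grow the core by 2 px; the ring is the grown area minus all cores
        ring = ndimage.binary_dilation(mask, iterations=2) & ~core
        if ring.any():
            out[mask] = img[ring].mean(axis=0)  # per-channel ring mean
    return out
```

A real implementation would want a radial profile or gradient fill rather than a flat mean, but even this crude version shows the idea: the core inherits plausible linear values from its immediate surroundings.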
Any ideas? Thanks for any help.