Author Topic: LRBG v. RGB again  (Read 24806 times)

Offline jkmorse

  • PixInsight Padawan
  • ****
  • Posts: 931
  • Two questions, Mitch . .
    • Jim Morse Astronomy
Re: LRBG v. RGB again
« Reply #15 on: 2013 September 22 00:45:30 »

In my case, the only drawback of a separate Lum filter is that it performs terribly under heavy light pollution. I use Astrodon LRGB filters; the RGB filters have a "hole" in their passbands to reject light pollution. I also image with an Astrodon Ha filter and blend Ha with Red to make HaRGB images instead of LRGB. I have never been successful imaging with the Lum filter under heavy light pollution.

I guess it depends on the environment in your area whether the Lum filter works for you or not.

I feel your pain on the question of using a Lum filter in heavy light pollution.  I live in Dubai currently, and in Houston when in the States, and both are awful.  My choice is to drive (4.5 hours in Dubai, 90 minutes in Houston) to a more suitable location.  Under those lights, your only real option for any kind of clarity is to shoot narrowband.  I have imaged using Ha, OIII and SII Astrodon filters under a full moon with no problems.  Colors are "unnatural", though there are lots of articles about using two-color NB images to build a final image with more natural colors.  You have my sympathies, brother.
Really, are clear skies, low wind and no moon that much to ask for? 

New Mexico Skies Observatory
Apogee Aspen 16803
Planewave CDK17 - Paramount MEII
Planewave IFR90 - Astrodon LRGB & NB filters
SkyX - MaximDL - ACP

http://www.jimmorse-astronomy.com
http://www.astrobin.com/users/JimMorse

Offline Geoff

  • PixInsight Padawan
  • ****
  • Posts: 908
Re: LRBG v. RGB again
« Reply #16 on: 2013 September 22 04:34:26 »
Re narrowband: I find the mapping R,G,B -> Ha,OIII,OIII works quite well for giving natural colour.
Don't panic! (Douglas Adams)
Astrobin page at http://www.astrobin.com/users/Geoff/
Webpage (under construction) http://geoffsastro.smugmug.com/

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Re: LRBG v. RGB again
« Reply #17 on: 2013 September 23 03:50:51 »
Quote
do you know of a written PixInsight workflow to do a [LRGB]-RGB as stated by Mischa?

In general, the unweighted average of individual RGB components does not lead to an optimal result in signal-to-noise ratio terms. An optimal synthetic luminance must assign different weights to each color component, based on scaled noise estimates.

This process can be done very easily with the ImageIntegration tool, as I'll describe below. However, for the sake of understanding how things work, I'll describe the algorithm first. Let L, R, G, B be the luminance image and the three color components of the color image, respectively, and let's take L as the reference image for normalization. First we compute the scaling factors:

kR = sL/sR
kG = sL/sG
kB = sL/sB

where sL, sR, sG and sB are scale estimates (or estimates of statistical variability or dispersion) for the luminance, red, green and blue components, respectively. Robust and efficient scale estimators play an important role here.

Now we need noise estimates, which we can compute easily with the NoiseEvaluation script. Call the unscaled noise estimates nL, nR, nG and nB, respectively. Now we can compute the optimal combination weights:

wL = 1/nL^2
wR = 1/(kR * nR)^2
wG = 1/(kG * nG)^2
wB = 1/(kB * nB)^2

We then normalize the weights so they sum to one:

W = wL + wR + wG + wB
rL = wL/W
rR = wR/W
rG = wG/W
rB = wB/W

Define zero offsets:

dR = pL - pR
dG = pL - pG
dB = pL - pB

where pi are robust estimates of location (typically, the median of all pixel samples in each image).

The SNR-optimized synthetic luminance is given by:

Lopt = rL*L + rR*(R + dR) + rG*(G + dG) + rB*(B + dB)
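For illustration, here is a minimal NumPy sketch of the algorithm above, run outside PixInsight. The function names are hypothetical, MAD is used as just one possible robust scale estimator (ImageIntegration offers several), and the unscaled noise estimates are taken as given (e.g. from the NoiseEvaluation script):

```python
import numpy as np

def mad(x):
    # Median absolute deviation: one possible robust scale estimator.
    return np.median(np.abs(x - np.median(x)))

def synthetic_luminance(L, R, G, B, n):
    """SNR-optimized combination of L, R, G, B (NumPy arrays).
    n maps 'L', 'R', 'G', 'B' to unscaled noise estimates."""
    chans = {'L': L, 'R': R, 'G': G, 'B': B}
    sL, pL = mad(L), np.median(L)        # reference scale and location
    w, d = {}, {}
    for c, img in chans.items():
        k = sL / mad(img)                # scaling factor k = sL/sC (kL = 1)
        w[c] = 1.0 / (k * n[c]) ** 2     # weight w = 1/(k*n)^2
        d[c] = pL - np.median(img)       # zero offset d = pL - pC (dL = 0)
    W = sum(w.values())                  # normalize weights to sum to 1
    return sum((w[c] / W) * (chans[c] + d[c]) for c in chans)
```

As a sanity check, feeding four identical channels with equal noise estimates returns the input unchanged, since all weights are equal and all offsets are zero.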

This process could be done with PixelMath and the JavaScript runtime, but it would be tedious. Fortunately, it is very easy to implement with the ImageIntegration tool. If you work with an OSC camera you have to split your color image first with the ChannelExtraction tool, and save the individual R, G and B images as FITS files. Open ImageIntegration and select the four files. Then leave all tool parameters at their default values (you can click the Reset button to make sure) and click the Apply Global button. The relevant parameters are as follows:

- Combination = Average
- Normalization = additive with scaling
- Weights = Noise evaluation
- Scale estimator = iterative k-sigma
- Generate integrated image = enabled
- Evaluate noise = enabled
- Pixel rejection = No rejection
- Clip low range = disabled
- Clip high range = disabled

You can run several tests with different scale estimators and select the one that yields the highest noise reduction. The integration result is the optimal luminance image, which you can treat in the usual way (deconvolve it if appropriate, stretch it to match the implicit RGB luminance, combine with LRGBCombination, etc.).

Note that the same procedure can be used to compute an optimal luminance for an RGB image, that is, without an additional L image. In this case just use one of the RGB channels as the reference for integration.

[Edited to fix an error in the equation for combination weights (missed the squares in the denominators - necessary since we are minimizing mean square error)]
[Edited to include zero offset normalization]
« Last Edit: 2013 September 24 02:31:10 by Juan Conejero »
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline jkmorse

  • PixInsight Padawan
  • ****
  • Posts: 931
  • Two questions, Mitch . .
    • Jim Morse Astronomy
Re: LRBG v. RGB again
« Reply #18 on: 2013 September 23 08:10:06 »
Geoff & Juan,

Quote
Well, unfortunately there is always this tradeoff: save time, lose quality; spend (waste?) time, gain quality. See these posts by Juan:
http://pixinsight.com/forum/index.php?topic=1137.msg5592#msg5592
http://pixinsight.com/forum/index.php?topic=1636.msg9297#msg9297

Ok, I am willing to try, since when I shoot NB I certainly don't need a separate Lum image, just the Ha, OIII and SII stacks.  And the posts you quote seem to support ignoring the Lum altogether and just working in RGB space.  But then why all the talk about the need to create a synthetic Lum?  That seems to recreate all the problems that Juan was arguing against.

Confused but willing to learn.

Jim
Really, are clear skies, low wind and no moon that much to ask for? 

New Mexico Skies Observatory
Apogee Aspen 16803
Planewave CDK17 - Paramount MEII
Planewave IFR90 - Astrodon LRGB & NB filters
SkyX - MaximDL - ACP

http://www.jimmorse-astronomy.com
http://www.astrobin.com/users/JimMorse

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Re: LRBG v. RGB again
« Reply #19 on: 2013 September 23 09:38:21 »
Hi Jim,

I still prefer RGB to unbinned LRGB. However, what is being discussed here is how to optimize the luminance in terms of SNR maximization. This is interesting because the human vision system detects image details mainly through the lightness. The contribution of color differences to detail detection is comparatively very small.

Our goal here is to replace the implicit luminance of the linear RGB image with a linear combination of its individual RGB components that minimizes noise. To solve this optimization problem, we must find the coefficients (or weights) that multiply each channel in the solution. Then we use that linear combination as if it were an independent luminance image, and proceed as we do in a normal LRGB workflow. The resulting nonlinear RGB image will have less noise perceptually, since its lightness component will have the highest possible SNR with the available data. It can be shown that the total SNR of the image does not change; it is just the way noise is distributed that varies. This is a form of apparent noise reduction based on the roles that color and brightness play in our vision system.
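A quick numerical illustration (with made-up noise levels) of why the inverse-variance weights described earlier beat a plain average when the channels have unequal noise:

```python
import numpy as np

rng = np.random.default_rng(42)
truth = np.full(100_000, 0.5)                  # a flat "ideal" signal
sigmas = [0.01, 0.03, 0.05]                    # hypothetical channel noise levels
chans = np.array([truth + rng.normal(0.0, s, truth.size) for s in sigmas])

plain = np.mean(chans, axis=0)                 # unweighted average
w = np.array([1.0 / s**2 for s in sigmas])     # inverse-variance weights...
w /= w.sum()                                   # ...normalized to sum to 1
optimal = np.tensordot(w, chans, axes=1)       # weighted combination

# The weighted combination leaves less residual noise than the plain average.
print(np.std(plain - truth), np.std(optimal - truth))
```

With these numbers the weighted combination roughly halves the residual noise relative to the plain average, because it leans on the cleanest channel.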

When the same method is applied to an LRGB set, we find an optimal combination of the four available components. I am seriously considering writing a new SyntheticLuminance tool to automate these processes...
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline mschuster

  • PTeam Member
  • PixInsight Jedi
  • *****
  • Posts: 1087
Re: LRBG v. RGB again
« Reply #20 on: 2013 September 23 16:47:46 »
Juan,

I may be mistaken but it appears you suggest integrating L, R, G, and B frames ignoring their spectral bands, as if they were all monochrome, with no rejection.

Typical RGB to luminance conversions use spectral weighting G > R > B. Why are you not including some form of such a weighting?

Also, I believe additive with scaling does location normalization, but your description does not include a location offset; is this correct?

Thanks,
Mike

Offline Geoff

  • PixInsight Padawan
  • ****
  • Posts: 908
Re: LRBG v. RGB again
« Reply #21 on: 2013 September 23 21:39:09 »
Quote
The resulting nonlinear RGB image will have less noise perceptually, since its lightness component will have the highest possible SNR with the available data. It can be shown that the total SNR of the image does not change; it is just the way noise is distributed that varies. This is a form of apparent noise reduction based on the roles that color and brightness play in our vision system.

Glad to see this explanation.  I always thought that using the same data twice (synthetic lum from RGB) to get a free lunch seemed too good to be true.  Now I see that we still pay for the lunch, but its arrangement on the plate is different.
Geoff
Don't panic! (Douglas Adams)
Astrobin page at http://www.astrobin.com/users/Geoff/
Webpage (under construction) http://geoffsastro.smugmug.com/

Offline pfile

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 4729
Re: LRBG v. RGB again
« Reply #22 on: 2013 September 23 22:27:08 »
Juan,

Quote
I may be mistaken but it appears you suggest integrating L, R, G, and B frames ignoring their spectral bands, as if they were all monochrome, with no rejection.

Typical RGB to luminance conversions use spectral weighting G > R > B. Why are you not including some form of such a weighting?

Also, I believe additive with scaling does location normalization, but your description does not include a location offset; is this correct?

Thanks,
Mike

IIRC Juan has explained before that L* is in fact perceptually weighted, as it is tied to human vision, but luminance (in the astronomical context) is not (and should not be). In fact, if you attempt to make a non-noise-weighted synthetic Lum image from an RGB image, you should first set the RGB weights to 1:1:1 and set the gamma to 1 using the RGBWorkingSpace process; otherwise the extracted L* would be exactly as you describe.

rob

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Re: LRBG v. RGB again
« Reply #23 on: 2013 September 24 02:08:15 »
Quote
you suggest integrating L, R, G, and B frames ignoring their spectral bands, as if they were all monochrome, with no rejection.

Exactly.

Quote
Typical RGB to luminance conversions use spectral weighting G > R > B. Why are you not including some form of such a weighting?

Because the function that I am calculating has no perceptual meaning. It is actually a function of luminous intensity. In reality it has no direct physical meaning, either; it is just a purely numerical solution to an optimization problem (minimize mean square error in the combination of individual RGB channels). As a side note, the terms luminance and brightness are often used very loosely in astronomical imaging.

Once we have an optimal combination of RGB (or LRGB) components, we delinearize it to replace the lightness (CIE L*) component of the stretched RGB image (with the LRGBCombination tool). By doing this replacement we are conferring a perceptual meaning to our optimized combination (this is precisely why we get the apparent noise reduction), so it now works as an "optimized lightness" component, and hence calling it luminance when it is in its linear form is not completely incorrect IMO.

Quote
Also, I believe additive with scaling does location normalization, but your description does not include a location offset, is this correct?

You're right, I forgot to include the zero offsets in my description. I'm going to fix it right now; thanks for pointing it out.
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline jkmorse

  • PixInsight Padawan
  • ****
  • Posts: 931
  • Two questions, Mitch . .
    • Jim Morse Astronomy
Re: LRBG v. RGB again
« Reply #24 on: 2013 September 24 02:27:03 »
Geoff & Juan,

Quote
Glad to see this explanation.  I always thought that using the same data twice (synthetic lum from RGB) to get a free lunch seemed too good to be true.  Now I see that we still pay for the lunch, but its arrangement on the plate is different.
Geoff

I want to follow up on the "something for nothing" thread Geoff raised above.  I am a lawyer (I know, I know; even worse, I work for big oil), not a numbers guy, though I try, and I am fascinated by the theory.  Putting aside for the moment the synthetic lum issue: when I change my imaging workflow from LRGB with binned RGB stacks to unbinned RGB, I will likely shoot 3 hours of each.  When I was shooting 5 hours of Lums, even I could figure out that I was getting the benefit of 5 hours of lum data for SNR purposes.  But when I shoot three sets of unbinned RGB at 3 hours each, am I getting 3 hours of embedded Lum data for SNR purposes, or 9 hours, or something in between?

Thanks,

Jim
Really, are clear skies, low wind and no moon that much to ask for? 

New Mexico Skies Observatory
Apogee Aspen 16803
Planewave CDK17 - Paramount MEII
Planewave IFR90 - Astrodon LRGB & NB filters
SkyX - MaximDL - ACP

http://www.jimmorse-astronomy.com
http://www.astrobin.com/users/JimMorse

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Re: LRBG v. RGB again
« Reply #25 on: 2013 September 24 04:00:56 »
Hi Jim,

I would say that you are gathering the equivalent of less than 3 hours of clear luminance with 3 hours of RGB, perhaps much less, depending on the filters you use. Each RGB filter has a limited bandwidth, so the total flux that you accumulate is just a fraction of the flux accumulated with an L filter, or with no filter at all. That's why people take unbinned luminance exposures: to accumulate more light in less time and achieve more depth in their images.
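As a rough back-of-the-envelope check of Jim's scenario (the 1/3 bandpass fraction below is just an assumed round number; real color filters typically pass less than that relative to a clear L filter):

```python
# Idealized flux accounting: 3 hours through each of R, G and B, where
# each color filter is assumed to pass ~1/3 of the flux an L filter would.
bandpass_fraction = 1 / 3        # assumed per-filter fraction of L flux
hours_per_filter = 3
l_equivalent = 3 * hours_per_filter * bandpass_fraction

# 9 hours at the telescope accumulate only about 3 "L-hours" of flux,
# and real filter losses push the equivalent below even that.
print(l_equivalent)
```

So under this idealization, Jim's 9 total hours of RGB land somewhere near, but below, 3 hours of clear luminance for SNR purposes.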

The extra depth comes at a cost, however: more luminance implies less chrominance, so there will always be structures without good chrominance support (that is, grayscale structures, or structures where color saturation is zero or very low), and the chrominance in general will be a comparatively low-SNR data set. It is like a precision balance: each bit of luminance that you add to the balance (=image) requires a corresponding bit of chrominance to support it, or the balance tips (=the image will lack color saturation, especially in the dimmest structures).

To mitigate this imbalance, which is really an SNR problem, a smart trick is to shoot binned RGB. By binning color exposures one can accumulate four times more signal in the same time, which allows shorter subexposures (hence fewer tracking and flexure problems, etc.) and compensates for the large amount of signal accumulated in the unbinned luminance. Again, this comes at a cost: a lack of spatial resolution in the chrominance. This isn't as bad as it sounds because, as we have seen, we perceive most detail through the luminance. It is still bad, however, because as a result of the lack of resolution the image will not provide good chrominance support to small luminance structures, which again will appear unsaturated. In addition, an optimized luminance as described in this thread is not viable with binned RGB.

None of these problems happen with pure RGB images, since the luminance and the chrominance are balanced naturally. The lack of depth due to limited filter bandwidth, as well as the lack of chromatic richness due to non-overlapping filter response curves (a typical problem of common RGB CCD filters), can be addressed with appropriate subframe exposure times, more total integration time if necessary, and better filters with wider bandwidths and some overlap. An example is this image of NGC 7331, where Vicent Peris used photometric filters:

http://pixinsight.com/examples/NGC7331-CAHA/en.html

I strongly recommend reading Section 2 of the above document. Another example is this image of M51, acquired through Baader Planetarium filters:

http://pixinsight.com/gallery/M51-CAHA/en.html

Both are RGB or RGB+Ha images.
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline jkmorse

  • PixInsight Padawan
  • ****
  • Posts: 931
  • Two questions, Mitch . .
    • Jim Morse Astronomy
Re: LRBG v. RGB again
« Reply #26 on: 2013 September 24 04:23:12 »
Juan,

Thanks for setting me straight. 

Jim
Really, are clear skies, low wind and no moon that much to ask for? 

New Mexico Skies Observatory
Apogee Aspen 16803
Planewave CDK17 - Paramount MEII
Planewave IFR90 - Astrodon LRGB & NB filters
SkyX - MaximDL - ACP

http://www.jimmorse-astronomy.com
http://www.astrobin.com/users/JimMorse