Right... I just stumbled onto the fact that the images need to be stretched, or else the colors become very saturated and bizarre. However, now that I have stretched the individual channels and produced an LRGB composite, how do I color calibrate the image? Isn't it a bit too late once the data are nonlinear?
Perhaps that is what is intended by the channel scaling in LRGBCombination? And if so, how does one derive a proper set of scale parameters?
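For what it's worth, one common way to derive channel scale factors by hand (just a sketch of the usual white-balance heuristic; I don't know whether this is what LRGBCombination's channel scaling actually implements) is to sample a star-free background patch and scale each channel so the background medians come out equal:

```python
import numpy as np

def channel_scale_factors(rgb, background_region):
    """Derive per-channel multiplicative scale factors by equalizing
    the background medians of R, G, and B. This is a generic balancing
    heuristic, not necessarily LRGBCombination's internal method.

    rgb               -- float array of shape (H, W, 3), values in [0, 1]
    background_region -- (row_slice, col_slice) selecting a patch of sky
                         free of stars and nebulosity
    """
    rows, cols = background_region
    patch = rgb[rows, cols, :]                    # background sample
    medians = np.median(patch.reshape(-1, 3), axis=0)
    reference = medians.mean()                    # pull all channels toward the mean
    return reference / medians                    # per-channel multipliers

# Usage sketch: `img` stands in for an already-stretched RGB composite,
# with the top-left 16x16 corner assumed to be empty background.
img = np.random.default_rng(0).uniform(0.1, 0.3, size=(64, 64, 3))
scale = channel_scale_factors(img, (np.s_[:16], np.s_[:16]))
balanced = np.clip(img * scale, 0.0, 1.0)
```

After applying the factors, the background medians of the three channels match, which removes the kind of residual color cast (e.g. the green tinge) without touching the relative object colors much.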
As it happens, with an LRGB dataset at hand, the colors come out nearly right, though with a bit of green noise. It seems like a lucky outcome, but perhaps it was designed to work that way when using "matched" filter sets?
The above comments seem to apply only to HD data, not to standard precision. With standard-precision data sets, LRGBCombination seems to produce very nice results that can then be color calibrated in the usual manner. There must be something unique about double- versus single-precision float data.
------
Hmm... interesting... I can perform normal color calibration with BGNeut and ColorCal against the LRGBCombo of stretched HD frames. It seems to work just fine.