1st question: I'm having a hard time understanding what's happening here.
These are RGB data sets, and all three sets of uncalibrated images have roughly the same minimum sky background levels (R about half of B or G; B and G about the same). After calibration the ratios are about the same and nothing looks odd. I integrate using LocalNormalization (AdditiveWithScaling has the same effect), and my integrated green image comes out at about 10% of the R and B (i.e., the G background is 0.004 while R and B are closer to 0.030). Statistics shows about the same standard deviation in all of the integrated images, but the average of G is again about 10% of the B and R images.
If I disable normalization, the integrated G image is similar to the B and R. I've read the descriptions of the processes, but why I'm seeing such huge scaling eludes me. FWIW, my dataset is only some test images (15x60s of M81/82), not nearly enough, and transparency wasn't all that good either, but I'm at a loss to understand the results I'm getting. Any help appreciated!
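In case it helps to show the kind of behavior I mean, here's a rough Python sketch of what I understand "additive with scaling" normalization to do: shift and stretch each frame so its background location and dispersion match a reference frame before averaging. The formula, the median/std estimators, and the 0.004/0.030 toy numbers are my own assumptions for illustration, not PixInsight's actual implementation.

```python
import numpy as np

# My rough understanding of additive-with-scaling normalization (an assumption on
# my part, not PixInsight's actual code): match each frame's background location
# and dispersion to the reference frame before the frames are averaged.
def normalize_additive_with_scaling(frame, reference):
    m_ref, s_ref = np.median(reference), np.std(reference)
    m_frm, s_frm = np.median(frame), np.std(frame)
    return (frame - m_frm) * (s_ref / s_frm) + m_ref

# Toy numbers, just to show the effect I suspect: if the reference's background is
# low, every normalized frame (and hence the stack) inherits that low background,
# while the per-frame noise level is preserved.
rng = np.random.default_rng(0)
reference = rng.normal(0.004, 0.001, 10_000)              # hypothetical low-background reference
frames = [rng.normal(0.030, 0.001, 10_000) for _ in range(15)]
stack = np.mean([normalize_additive_with_scaling(f, reference) for f in frames], axis=0)
print(np.median(stack))   # ends up near 0.004, the reference background, not 0.030
```

If that mental model is wrong, that may be exactly where my confusion is coming from.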
2nd question: I thought I had a handle on flats until I went from a long-focus RC to a short-focus refractor and a new camera. It looks like my previous technique needs some adjustment, but my question is this: I've always abided by the conventional wisdom of exposing flats to 1/3-1/2 of the sensor's linear range, but should a flat instead be matched in some way to the average image level, the average background, or something along those lines, rather than to an arbitrary level? I've made flats with a flat panel at three different exposure levels for each filter, and when I use them to calibrate an existing dataset I can get very different results. I'd appreciate some advice...
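For context on the levels I'm talking about, here's the back-of-the-envelope arithmetic behind the "1/3 to 1/2 of the linear range" rule versus tying the flat to the lights. The 16-bit depth, the linearity limit, and the sky-background value are made-up placeholders, not my camera's actual figures.

```python
# What "1/3 to 1/2 of the linear range" works out to in ADU, using hypothetical
# numbers: a 16-bit readout with a made-up linearity limit of ~58k ADU.
full_scale = 2**16 - 1          # 65535 ADU for a 16-bit readout
linear_limit = 58_000           # hypothetical point where the sensor leaves its linear range
target_low, target_high = linear_limit / 3, linear_limit / 2
print(f"Target flat median: {target_low:.0f}-{target_high:.0f} ADU")

# The thing I'm unsure about: whether the flat level should instead be tied to the
# light frames, e.g. some multiple of the typical sky background in them.
sky_background = 1_200          # hypothetical median background of a light frame, ADU
print(f"Sky background is only {sky_background / linear_limit:.1%} of the linear range")
```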