Hi guys,
I processed the L differently from the RGB. The L suffered more from light pollution than RGB.
For the L:
1) Once I discovered (based on our discussion here) that I needed to use separate flats (taken a week apart) to match the Lum subs taken during that week, just about all of the dust motes disappeared. Only the various light pollution gradients remained.
2) The very densely packed DBE suggested here was used on the L stack (and later on the RGB stacks). Of course I took the time to make sure no sample point landed on a star or the galaxy. I had never used such a densely packed DBE before; I've seen recommendations that fewer samples are better, but in this case it really helped with the gradients.
3) After noise reduction on the L (TGVDenoise, MMT), I was left with some "rivers" of darker background areas amid the mostly even background. I've had this happen before after noise reduction with our light-polluted skies. So I applied a noise floor to the background: I measured the K value in an even portion of the background, created a new sub at this gray level, added gaussian noise approximating the remaining noise level in the L stack, and combined it with the L in PixelMath using a Max() function (which protects galaxies and stars, since their signal is brighter than the background). This eliminated the darker background rivers. Of course this technique wouldn't be very useful if there were nebulosity or IFN in the background, but in our case there was neither to worry about. The more even background allowed a better stretch. If you try this technique, be very careful in PixelMath not to overwrite real signal. Here is the basic formula (I tuned the multiplier to four decimal places to find the exact point at which the noise floor became too much): Max(Lum_image, NoiseFloor_image * .0092). A conceptual sketch of this step follows the image comparison below.
Here is a sequence showing you three images. Left (super stretched) shows the dark rivers after standard PI denoising. Center (super stretched) shows the same image after adding the noise floor. Right (standard stretch) shows the result after one more round of denoising in PI. This image was now ready for histogram stretching.
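In case it helps anyone experiment with the idea outside PixelMath, here is a rough Python/NumPy sketch of the noise-floor step. This is a conceptual illustration only, not what I actually ran: the function and variable names are made up, it assumes a normalized 0-1 float image, and my 0.0092 multiplier was specific to how my NoiseFloor image was scaled (with a floor built directly at the measured K level, a multiplier near 1.0 is the natural starting point).

```python
# Conceptual Python/NumPy sketch of the noise-floor step (illustration only;
# the real processing was the PixelMath Max() expression shown above).
import numpy as np

def apply_noise_floor(lum, bg_patch, multiplier=1.0, seed=0):
    """Lift dark background 'rivers' up to a synthetic noise floor.

    lum        : 2-D float array in [0, 1] (the denoised L master)
    bg_patch   : a slice of lum covering an even piece of background,
                 used to measure the background level (K) and residual noise
    multiplier : fine-tuning factor for the floor; adjust until the floor
                 is just below the point of being too much
    """
    k = np.median(bg_patch)       # background gray level (K)
    sigma = np.std(bg_patch)      # residual noise left after denoising
    rng = np.random.default_rng(seed)

    # Constant frame at the K level plus matching gaussian noise
    noise_floor = k + rng.normal(0.0, sigma, size=lum.shape)

    # Max() keeps real signal (galaxy, stars) wherever it is brighter than
    # the scaled floor and only lifts the darker background rivers.
    return np.maximum(lum, noise_floor * multiplier)

# Usage (loader and patch coordinates are placeholders):
# lum = load_L_master_as_float()
# fixed = apply_noise_floor(lum, lum[50:250, 1800:2000], multiplier=1.0)
```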
4) Stretched three times, once each for the core, the middle, and the outer arms of the galaxy, then used masks in Photoshop to combine them, bringing out the galaxy's high dynamic range.
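For what it's worth, the masked combination can be thought of like this (a minimal NumPy sketch purely to illustrate the idea; the actual blending was done with layer masks in Photoshop, and all names here are made up):

```python
# Minimal sketch of blending three stretches of the same image with masks
# (illustration only; the real blend was done with layer masks in Photoshop).
import numpy as np

def blend_stretches(core, mid, outer, core_mask, mid_mask):
    """core, mid, outer: float arrays in [0, 1], each stretched to favor
    the galaxy core, the mid regions, or the faint outer arms.
    core_mask, mid_mask: float masks in [0, 1] selecting where the gentler
    (core/mid) stretches should replace the aggressive outer-arm stretch."""
    out = outer.copy()                                 # start from the deepest stretch
    out = mid_mask * mid + (1.0 - mid_mask) * out      # restore mid-brightness detail
    out = core_mask * core + (1.0 - core_mask) * out   # protect the bright core
    return np.clip(out, 0.0, 1.0)
```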
Note: I had originally employed deconvolution before HistT, but deconvolution created too much contrast within the galaxy's dust lanes. I'm guessing this was due to the small size of the galaxy: there wasn't enough pixel range to create a smooth transition in deconvolution, and I ended up with posterized dust lanes that looked like a cartoon. So I left out linear sharpening and did targeted sharpening at the end of all processing using Franzis (a Photoshop plugin).
Here is a before/after deconvolution comparison (this posterization appeared with as few as 10 iterations using a PSF):
For the RGB, I didn't need to worry about noise floors. DBE (densely packed) and noise reduction did a decent job. However, because this was such a low-surface-brightness object, and in our worst direction (north), the noise was more than standard denoising in PI could handle (at least in my hands). Pushing through anyway, I was left with an RGB combination where noise was causing terrible stray colors within the galaxy, and nonlinear denoising couldn't take care of it. So instead I stretched without denoising first. This left much more natural, smoother colors in the RGB, and the remaining noise was much more evenly colored. I took this to Photoshop, where Franzis denoise really cleaned it up. Then back to PI to push saturation, combine with the L, and finish processing.
Note: I originally used standard color calibration (BackgroundNeutralization, ColorCalibration, SCNR), but later I went back and tried PhotometricColorCalibration. Photometric worked a lot better on our image.