I don't think I misunderstood anything; I'm too old, and wise, for that. I'll just leave it at that.
So, what do you think is the reason for your questioning of the 10.9 hours in a Bortle 3? What are you seeing, or not seeing, that makes you think there's a lack of data, or the data is weak?
no, you really have misunderstood my motivation. i'm only saying that something seems wrong to me. i'm not attacking your skills, or your person, or anything else. you say you are relatively new at this, so some kind of software config problem or equipment problem is much more likely than you fibbing about how you acquired the data.
again, i wasn't really questioning that it's 10.9 hours from bortle 3. i'm saying, to my eye, it doesn't look like it. and the signal in the subexposure does seem low to me, so this makes me wonder if something else is wrong here.
somewhere on my other computer i have a comparison of a single 1800s L frame from a dark site and 1800s of integration from bortle 8 on the same target. the difference is absolutely staggering. you almost don't need to stack anything from the dark site.
With all your experience, can you give me some suggestions on how I can improve things? Is more integration time needed? Would exposures longer than 120 seconds help? Would a different filter help, or removal of the UV/IR cut filter I'm currently using? Is it perhaps the way the integration was done in PixInsight? Could it be the fact that there was a bit of moon on a couple of nights?
that's what i'm saying. i don't think more integration time should be needed on this target from that location. the important thing in any subexposure (really the only thing) is to make sure the sky background swamps the read noise of the camera. on a CCD camera, there aren't many parameters other than the subexposure length that the user can vary to ensure this. on a CMOS camera, the gain matters too. i don't use a camera with that sensor, but there are probably "canned" gain and offset values that people tend to use for broadband and narrowband. if you've already checked that off the list, then the only thing left is to calculate whether your subexposure is long enough to swamp the read noise at the gain you are using.
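to make that concrete, here's a rough sketch of the arithmetic. all the numbers below are made up - plug in your camera's read noise at the gain you actually use, and a sky flux measured from one of your own calibrated subs:

```python
import math

# sketch: estimate the minimum sub-exposure length needed so the sky
# background "swamps" the camera's read noise.  the constants below are
# placeholders, not measurements from any real setup.

READ_NOISE_E = 1.7        # e- RMS, hypothetical CMOS read noise at chosen gain
SKY_FLUX_E_PER_S = 0.25   # e-/pixel/second, hypothetical dark-site sky + filter
SWAMP_FACTOR = 10.0       # common rule of thumb: sky shot-noise variance
                          # should be ~10x the read-noise variance

def min_sub_exposure(read_noise_e, sky_flux_e_per_s, swamp_factor=10.0):
    """Seconds needed so sky signal >= swamp_factor * read_noise^2.

    sky shot-noise variance equals the sky signal in electrons, so
    meeting this condition makes read noise a small contributor to
    the total noise in each sub.
    """
    return swamp_factor * read_noise_e ** 2 / sky_flux_e_per_s

t = min_sub_exposure(READ_NOISE_E, SKY_FLUX_E_PER_S, SWAMP_FACTOR)
print(f"minimum sub length: {t:.0f} s")
```

with these placeholder numbers the rule of thumb lands right around a two-minute sub, but the real answer depends entirely on your measured sky flux and your gain, so measure before concluding anything.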
i don't think you'd want to remove that filter, as fred has shown that it does not cut the Ha signal. that is why i asked if you were using the DSLR - the image looks to me like one taken with an unmodified camera, where most of the Ha signal is attenuated before it hits the sensor. the UV/IR cut filter is also probably helping you with blue bloat, since the OTA is apparently a doublet. i guess you *could* try without the filter to find out if the filter's passband is as advertised. filters being mis-labeled or having quality control problems is definitely something i've seen over the years. some filters also have a directionality, if only because they have an antireflective coating on one side but not the other.
yes, the moon can wreck your target SNR, since you end up capturing mostly skyglow. how far was the moon from the target, and what phase was it?
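for what it's worth, here's the back-of-the-envelope version of why skyglow hurts even though it calibrates out of the background: it still adds shot noise to every pixel. all the numbers below are hypothetical, just to show the shape of the effect:

```python
import math

# sketch: per-sub SNR for a faint target under dark vs. moonlit skies.
# skyglow subtracts out of the signal but its shot noise does not.
# every number here is made up for illustration.

def sub_snr(target_e_per_s, sky_e_per_s, exposure_s, read_noise_e):
    """Single-sub SNR: target signal over shot noise (target + sky)
    plus read noise, all in electrons."""
    signal = target_e_per_s * exposure_s
    noise = math.sqrt(signal + sky_e_per_s * exposure_s + read_noise_e ** 2)
    return signal / noise

dark = sub_snr(0.05, 0.25, 120, 1.7)   # hypothetical dark-site sky
moon = sub_snr(0.05, 2.5, 120, 1.7)    # same target, ~10x brighter sky
print(f"dark-sky SNR: {dark:.2f}, moonlit SNR: {moon:.2f}")
```

with these toy numbers a 10x brighter sky costs you roughly a factor of three in per-sub SNR, which you'd then have to buy back with something like 9x the integration time.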
as far as PI's integration is concerned, really the only things going on are how the frames are normalized, how they are weighted and whether or not pixel rejection was used. WBPP's weighting algorithm is pretty fancy, so that is probably OK. if you don't have any gradients in your image (and from that location, absent the moon, you shouldn't), then the "regular" normalization built into ImageIntegration is likely fine.
the main thing to check with ImageIntegration is that your pixel rejection was not too aggressive. you can integrate with and without rejection and compare the noise (with the NoiseEvaluation script); a marked difference can indicate that too many "real" pixels were rejected. or you can just look at the rejection maps coming out of II - if you see 'structure' in the maps (like the shape of the nebula), the rejection is possibly too aggressive. having said that, if some of your frames are real outliers in brightness, or were calibrated improperly, you might also see structure in the rejection maps (especially the low map) - in that case the structure just reflects frames that didn't normalize properly, not over-aggressive rejection.
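if you want a feel for what "too aggressive" means, here's a toy sigma-clipping example with made-up numbers. PixInsight's ImageIntegration does this per pixel across normalized, weighted frames, which this doesn't attempt - it just shows how a tight clipping threshold starts rejecting honest noise:

```python
import random
import statistics

# sketch: simulate one pixel's value across 30 subs (pure noise around
# a true level) and count rejections at two clipping thresholds.
# values and thresholds are invented for illustration only.

def sigma_clip_count(values, k):
    """Count values farther than k sigma from the stack median."""
    med = statistics.median(values)
    sigma = statistics.pstdev(values)
    return sum(1 for v in values if abs(v - med) > k * sigma)

random.seed(42)
stack = [random.gauss(1000.0, 25.0) for _ in range(30)]  # honest noise only

loose = sigma_clip_count(stack, 3.0)   # a typical threshold
tight = sigma_clip_count(stack, 1.0)   # far too aggressive

print(f"rejected at 3 sigma: {loose}/30, at 1 sigma: {tight}/30")
```

at 1 sigma you'd expect to throw away roughly a third of perfectly good pixels at every location, and since the brightest parts of the nebula have the most shot noise, that's exactly where the rejection map grows 'structure'.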
short of having all the raw subs and running them all myself, i don't know what else i can suggest at this point.
rob