In working on a new image I have come across a problem that I haven't had before. I am ending up with dark black speckles in the image. I have done a comparative calibration with CCDStack and I don't have the same issue. I feel I must be doing something wrong.
The problem occurs only in my color data.
Here is my workflow.
- Create master bias from 42 frames
- Create master dark from 42 frames (-20C, 300 seconds)
- Create master flat, calibrated using the above-mentioned master bias and dark; these three steps all follow the settings recommended in the dark/flat tutorial
- Calibrate lights using the above master files
- Align all luminance frames and combine (no problem with the end result there). Downsample the aligned luminance with integer resampling to match 2x2 binning, then align the RGB to the downsampled luminance. I register the RGB using nearest-neighbor interpolation so each bad pixel lands on a single destination pixel rather than being spread across several.
- Combine blue (for this example) using the following settings. Note that the RGB frames in this case are 200 seconds each at -18C; the difference in exposure time and temperature has not caused me trouble in the past.
- Combination: average
- Normalization: additive
- Weights: Noise Evaluation
- Pixel rejection: Averaged Sigma Clipping (I have 6 frames in this case)
- Rejection normalization: Scale + zero offset
- Sigma low: 1.7, sigma high: 1.7 (I expect there to be outliers)
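For reference, here is a minimal numpy sketch of what sigma clipping with low/high thresholds of 1.7 does at one pixel position across a 6-frame stack. This is plain iterative sigma clipping, not PixInsight's exact averaged-sigma-clipping implementation, and the pixel values are made up for illustration:

```python
import numpy as np

def sigma_clip_mean(values, sigma_low=1.7, sigma_high=1.7, max_iters=5):
    """Iteratively reject values beyond sigma_low/sigma_high standard
    deviations of the mean, then average the survivors. A simplified
    stand-in for averaged sigma clipping, just to show the thresholds."""
    v = np.asarray(values, dtype=float)
    keep = np.ones(v.shape, dtype=bool)
    for _ in range(max_iters):
        m, s = v[keep].mean(), v[keep].std()
        new_keep = (v >= m - sigma_low * s) & (v <= m + sigma_high * s)
        if np.array_equal(new_keep, keep):
            break
        keep = new_keep
    return v[keep].mean()

# Six hypothetical values for one pixel; the 40 is a dark outlier
# (e.g. an uncorrected cold pixel that dithering moved around).
stack = [100, 101, 99, 100, 102, 40]
print(sigma_clip_mean(stack))  # the 40 is rejected; result is 100.4
```

With only 6 frames the per-pixel statistics are noisy, so whether a given low outlier gets rejected or averaged in can vary from pixel to pixel, which is one way dark speckles can survive integration.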
In my integrated image, I get dark pixels that are less than 50% of the value of the pixels around them. I tried the same workflow in CCDStack (which uses a dark adaptation method they call RMS) and did not get the dark pixels. Sample images are attached.
The underlying images are all dithered.
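As an aside on the downsampling choice above, here is a small numpy sketch (on a hypothetical 4x4 frame) of why nearest-neighbor keeps a bad pixel on a single destination pixel, while an averaging 2x2 bin dilutes it into its neighbors:

```python
import numpy as np

# Hypothetical 4x4 frame: uniform sky of 1.0 with one dead pixel at (0, 0).
frame = np.ones((4, 4))
frame[0, 0] = 0.0

# 2x2 average binning: each output pixel is the mean of a 2x2 block,
# so the dead pixel is smeared into its bin at quarter strength.
binned = frame.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# Nearest-neighbor 2x downsample: each output pixel is a copy of one
# source pixel, so a bad pixel either survives whole or disappears.
nearest = frame[::2, ::2]

print(binned[0, 0])   # 0.75 -> dead pixel partially diluted into the bin
print(nearest[0, 0])  # 0.0  -> dead pixel stays a single full-depth pixel
```

This is consistent with the intent stated in the workflow: nearest-neighbor concentrates each defect on one destination pixel, which makes it a cleaner target for pixel rejection during integration.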
I feel I must be doing something wrong since I have not had this trouble with similar data sets.
Thanks in advance,
--Andy