New script for denoising linear monochannel images

I wrote a new script that estimates detector settings for use by MureDenoise.

It will appear soon as an auto update as Scripts > Noise Reduction > MureDenoiseDetectorSettings.

The new script is very easy to use and fast. You supply two uncalibrated flat frames and either two bias frames or two dark frames. Using two dark frames rather than two bias frames allows the script to account for dark current noise.

The script provides estimates for detector gain, Gaussian noise, and offset for use by MureDenoise.
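The script's internals aren't spelled out in this thread, but the classic two-frame photon-transfer method it resembles can be sketched in pure Python with simulated frames. The detector numbers below are invented for the demo, not taken from this thread:

```python
import random
import statistics

random.seed(1)

# Simulated detector (hypothetical numbers, not from this thread)
GAIN = 0.4          # e-/DN
READ_NOISE = 25.0   # Gaussian noise, DN
OFFSET = 100.0      # bias pedestal, DN
N = 20000           # pixels per frame

def bias():
    return [OFFSET + random.gauss(0, READ_NOISE) for _ in range(N)]

def flat(mean_e=30000.0):
    # Poisson shot noise approximated as Gaussian for large counts
    return [OFFSET + random.gauss(mean_e, mean_e ** 0.5) / GAIN
            + random.gauss(0, READ_NOISE) for _ in range(N)]

def estimate(f1, f2, b1, b2):
    # Differencing each pair cancels fixed pattern; random variances add.
    df = [a - b for a, b in zip(f1, f2)]
    db = [a - b for a, b in zip(b1, b2)]
    signal = (statistics.fmean(f1) + statistics.fmean(f2)
              - statistics.fmean(b1) - statistics.fmean(b2))
    gain = signal / (statistics.pvariance(df) - statistics.pvariance(db))
    noise = (statistics.pvariance(db) / 2) ** 0.5
    return gain, noise  # e-/DN, DN

gain, noise = estimate(flat(), flat(), bias(), bias())
```

Differencing two frames of each type is why pairs are needed: it cancels fixed-pattern structure (PRNU, bias gradients) so only the random noise contributes to the variance.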

The script also provides a flat frame exposure estimate. As a double check on exposure, IMO you typically want to see a value between roughly 30% and 70% of detector full-well in e-, to provide sufficient SNR and avoid problems with near-saturation nonlinearity.
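As a concrete example of that double check (all numbers hypothetical, not the detector discussed in this thread):

```python
# Hypothetical detector and flat (not values from this thread)
gain = 0.4             # e-/DN
offset = 100.0         # DN
full_well = 20000.0    # e-
flat_median = 24000.0  # DN, median of an uncalibrated flat frame

exposure_e = (flat_median - offset) * gain  # 9560 e-
fraction = exposure_e / full_well           # 0.478 -> inside the 30-70% window
```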

MureDenoiseDetectorSettings.png

 
mschuster said:
MureDenoise 1.25 should appear soon as an auto update.

Fixed a bug in Use image metadata due to a typo.

Increased maximum Cycle-spin count to 48.

The text Use image metadata [none] indicates either unavailable image metadata or unsupported ImageIntegration options.

does LocalNormalization count as an unsupported ImageIntegration option? if so, what should be done in this case? not sure what to configure and where when you uncheck the 'use image metadata' checkbox.

rob
 
pfile said:
does LocalNormalization count as an unsupported ImageIntegration option? if so, what should be done in this case? not sure what to configure and where when you uncheck the 'use image metadata' checkbox.

Hi Rob,

LocalNormalization is unsupported. Its pixel-wise modifications are difficult to model. Have you been using it previously with MureDenoise?

You have two options now:

1) Uncheck Use image metadata, set Variance scale to 1 and then adjust as needed.

2) Open File > FITS Header on your image, search for the keyword entry HISTORY > ImageIntegration.outputNormalization: Local, replace "Local" with "Additive + scaling", apply the change, and save. Then check Use image metadata, set Variance scale to 1 and then adjust as needed.

Either way, if you get an acceptable result, please let me know. Also, if you have been using LocalNormalization previously with the script please let me know.

Edit: Rob, please place your image in Dropbox so that I can take a look at its metadata. I am looking now at PCL II to see what it actually does, but I would like an example to double check. Thanks.
 
i have probably not been using it with LocalNormalization; i usually don't use LN but i have a dataset with a very difficult flattening problem and so i tried it out.

if "use image metadata" is checked, does the script ignore the Variance Scale setting? it seems like it doesn't, so does it treat the value 1 as a special case to internally compute a different value?

i will upload the image.

rob

edit: https://drive.google.com/file/d/1RzE9i1boGHdsK65VJyFALyRD2HRBNvhw/view?usp=sharing
 
pfile said:
i have probably not been using it with LocalNormalization; i usually don't use LN but i have a dataset with a very difficult flattening problem and so i tried it out.

if "use image metadata" is checked, does the script ignore the Variance Scale setting? it seems like it doesnt, so does it treat the value 1 as a special case to internally compute a different value?

i will upload the image.

Ok, thank you Rob. Please also post your detector settings and interpolation method. I would like to try denoising.

And the value 1 is not a special case. It always represents the "nominal" amount of denoising appropriate for the option settings you have chosen. You can always adjust it up or down to get more or less denoising.

With "use image metadata" option checked and metadata available, the script always applies an additional transformation to the image that improves denoising accuracy. Basically this transformation accounts for the various things ImageIntegration did to the frames and the final result. When not checked, this additional transformation is not applied.
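The exact transformation isn't published, but one thing it must capture is how combining frames rescales noise. A sketch with a hypothetical helper (my illustration, not the script's code):

```python
# Noise of a weighted average of frames (illustrative helper only)
def stack_noise_dn(frame_noise_dn, n_frames, weights=None):
    # Weighted mean: sigma_stack = sigma_frame * sqrt(sum w_i^2) / sum w_i
    if weights is None:
        weights = [1.0] * n_frames
    s1 = sum(weights)
    s2 = sum(w * w for w in weights)
    return frame_noise_dn * (s2 ** 0.5) / s1

# 16 equally weighted frames with 40 DN noise each -> 10 DN in the stack
stack16 = stack_noise_dn(40.0, 16)
```

ImageIntegration records frame count, weights, and normalization in the metadata, which is presumably why the checkbox needs that metadata to be present and supported.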

The same holds true for "Include gradient classifier". If checked, the script applies an additional stage that further improves denoising accuracy, but it runs noticeably slower.
 
OK thanks for the clarification, now i understand a little better.

on the interpolation i'm not sure, it's set to auto in StarAlignment and MureDenoise seems to always be set to Lanczos-3.

my sensor params are:

gain: 0.379
gaussian noise: 30.36
offset: 0

i did run muredenoise on this image and the results seemed pretty good. i did have to crank the variance scale all the way up to 2.4 though.

rob
 
Thank you Rob, the image noise level seems ~2x high for those detector settings. Individual frame noise levels (recorded in the metadata) also seem ~2x high.

Do you have two recent bias frames and two recent flat frames? If so, and if possible, please run my new script MureDenoiseDetectorSettings and tell me what you get.
 
i do have new bias, darks, and flats taken just the other night. i'll try the new script.

some time ago i had characterized the sensor using your older scripts but admittedly it's been a couple of years now. and i am definitely seeing bias/dark drift as i've tried to calibrate some recent images with older bias/darks and clearly the calibrated result is bad.

rob

edit: attached a screenshot of MureDenoiseDetectorSettings. i guess i need to be more careful with sensor characterization. DarkBiasNoiseEstimator gives the same result for the same 2 bias frames and a result of 28.67 DN for 2 1800s darks taken the other night.
 

Attachments

  • Screen Shot 2019-11-26 at 5.47.21 PM.png
Thank you Rob,

The script's estimates are extremely close to manufacturer specs: 0.37 e-/DN, 25.1 DN.

It may be that LN is changing the noise statistics of your image, or possibly something else in your preprocessing accounts for this change.

Given this uncertainty, I want to be conservative and say MureDenoise won't work reliably for LocalNormalization in general.

However it is completely OK to try it anyway to see if you can get something useable.
 
Hi Rob, update:

The medians of the frames you integrated range between about 1600 DN and 2400 DN.

The median of the integrated result is about 160 DN.

This change in median appears to be due to LocalNormalization.

This change is definitely what is giving MureDenoise difficulty.

Basically there is more shot noise in the integrated sky background (from the high median frames) than there should be given its apparent low intensity level. MureDenoise decides this fluctuation must be due to signal, since it is larger than the expected noise level. Hence it was not removed when you tried variance scale 1.
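The mismatch can be put in numbers using Rob's detector settings and a single-frame shot-plus-Gaussian noise model (my reading of the model, so treat the formula as an assumption; integration scales both figures down equally, so the ratio is what matters):

```python
def expected_noise_dn(intensity_dn, gain=0.379, gaussian_dn=30.36, offset=0.0):
    # Model assumption: variance in DN^2 = shot term (I - offset)/gain
    # plus the Gaussian noise term sigma_g^2
    shot_var = max(intensity_dn - offset, 0.0) / gain
    return (shot_var + gaussian_dn ** 2) ** 0.5

# Frames near their true ~2000 DN median carry this much noise per pixel:
noise_at_2000 = expected_noise_dn(2000)  # ~79 DN
# After LN drags the median to ~160 DN, the model expects far less:
noise_at_160 = expected_noise_dn(160)    # ~37 DN
# The real background fluctuation still reflects the ~2000 DN sky, so it
# exceeds the expected level and gets kept as "signal" instead of removed.
```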

Bottom line: Using MureDenoise with LocalNormalization is probably not a good idea.
 
thanks for the analysis. i was wondering what was wrong. my flow here is very convoluted, so it's possible i've done something wrong along the way.

if i don't use the LN frames then muredenoise does very well using the image metadata and a variance scale of 1. actually i need to dial it down to about 0.75 to keep it from being too smooth.

as per usual, my data is absolute crap - bortle red zone and RGB imaging don't really go together but i keep on banging my head against it for some reason. normally i cull pretty aggressively with subframeselector but this time i just ditched the obviously awful frames only using blink and integrated the balance. it is very likely that some of those input frames are just bad; i need to go back with SFS and see if there are a bunch with super-high background (it certainly sounds like there are given your analysis.)

also, i continue to suffer from some problem which makes flattening these high-LP frames very difficult - i am always left with large-scale circular artifacts in the integrated image. they are large enough that they could be reflections off my flattener as the size would indicate whatever it is is about 3" from the sensor. furthermore, my camera apparently failed and i removed it from the OTA before obtaining flats; despite getting it working again and putting it back in the same orientation it is possible the flats simply don't match anymore.

anyway because i think i've exhausted all other options (sky flats vs. panel flats, flocking), i started playing around with fabian neyer's method of sky-correcting panel flats. what i ended up doing was just integrating all my images together without registration, aggressively rejecting, and then doing DBE to that integrated image, then removing scales 1-128 from the background model a couple of times and multiplying the flat by the smoothed model. when i did this to the blue channel i did not rescale anything and the resultant modified master flat is very dim. so this might have had the effect of brightening up the calibrated images; not sure. in theory the flat scaling during calibration should have undone this dimming.

although this resulted in a reasonably decent registered/integrated image, it did still need DBE, probably due to my "sky model" being inaccurate. the DBE was successful, and at some point it dawned on me that i could use the DBE'd image as a reference for LN and fix the original subs straight away without modifying the flat. when i did this, the SNR of the LN'd master seemed about an order of magnitude greater than the non-LN master so i concluded that this was the right flow. however, on the R channel i did not observe this effect. one difference on the R channel is i started rescaling the multiplication of the flat by the smoothed DBE model. i also got dark artifacts around some stars @ LN scale 128. moving to 256 cured that. finally, as juan has stressed, running NoiseEvaluation on two different images like that yields an invalid comparison, so the SNR was probably not a good metric to use to compare the two masters.

anyway this is all a very long-winded way to say that i've absolutely beaten the hell out of this data so it may be beyond what you'd normally see with LN in terms of bad quality.

rob


 
Hi Rob,

Thanks for the info.

I tried LN on some frames, generating the actual normalized frames so that I could look at them.

I did not see a large change in median. Medians were changed, scales also, to roughly match those of the reference. A result which seems reasonable to me.

So it may be that the large median change in your project was due to some other processing.

But even so with these more moderate results, MureDenoise can't/won't account for LN processing. So it remains not a good idea for use with MureDenoise, at least to get "accurate" results.

MureDenoise strongly relies on pixel intensities to determine expected noise levels (ie, combined shot and sensor noise, parameterized by detector settings). Anything that tweaks pixel values in ways MureDenoise is not aware of will cause trouble.
 
Hello Mike,

If we are to apply LN to the data, we can use MureDenoise on the individual subframes right after calibration, prior to registration, LN and stacking, can't we?

I can imagine that this would be a very time consuming processing but if this is the only way to get the best out of the data at hand, I think we can all bear with it.

How well does CosmeticCorrection play with MureDenoise? Should we eliminate this step if we know we're going to use MD?
 
Hello Sedat,

Sedat said:
If we are to apply LN to the data, we can use MureDenoise on the individual subframes right after calibration, prior to registration, LN and stacking, can't we?

Don't denoise individual frames, as doing so risks compromising SNR. Weak signals that are buried in noise in the frames, but rise above the noise in the integration, might be removed.
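The point can be illustrated with toy numbers (made up for illustration): a feature can sit below a single frame's noise floor yet above the stack's.

```python
signal = 10.0        # faint feature, DN (toy value)
frame_noise = 30.0   # per-frame noise, DN (toy value)
n_frames = 36
stack_noise = frame_noise / n_frames ** 0.5  # 5.0 DN after averaging

# In one frame the feature is below the noise floor, so a per-frame
# denoiser would treat it as noise and remove it:
visible_in_frame = signal > frame_noise   # False
# In the integration it rises above the noise and survives:
visible_in_stack = signal > stack_noise   # True
```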

Sedat said:
How well does CosmeticCorrection play with MureDenoise? Should we eliminate this step if we know we're going to use MD?

I have not tried CosmeticCorrection. For my projects, pixel rejection during integration solves the problem on my long exposures. My guess is that CosmeticCorrection is OK, especially for isolated pixels. But for bad rows/cols please double check MureDenoise results.
 
mschuster said:
Hi Rob,

Thanks for the info.

I tried LN on some frames, generating the actual normalized frames so that I could look at them.

I did not see a large change in median. Medians were changed, scales also, to roughly match those of the reference. A result which seems reasonable to me.

So it may be that the large median change in your project was due to some other processing.

But even so with these more moderate results, MureDenoise can't/won't account for LN processing. So it remains not a good idea for use with MureDenoise, at least to get "accurate" results.

MureDenoise strongly relies on pixel intensities to determine expected noise levels (ie, combined shot and sensor noise, parameterized by detector settings). Anything that tweaks pixel values in ways MureDenoise is not aware of will cause trouble.

no idea what changed the median then since the image was not processed, but i did run DBE so maybe that messed things up. or maybe my integration reference was bad, as mentioned i didn't really cull any frames or choose a good reference.

at any rate on this project i think i have decided the results are OK without LN so i will go back and reintegrate the "corrected" Blue subs and move forward from there. the MureDenoised LN image is definitely weird, i guess owing to the low median.

rob
 
pfile said:
it dawned on me that i could use the DBE'd image as a reference for LN

Rob, OK, a followup:

Maybe you used an LN reference image that was not a frame and also not included in the actual integration?

If so, what is its median?
 
it's been a couple of days since i did this so i am not sure if i still have the image that i used as a reference; not sure i can definitively give the median (unless the .xnrml files contain a string giving the name of the reference image.)

at any rate the LN reference image was not a subframe, it was an integrated master. this probably has something to do with the problem, however, i was under the impression that it was OK to use an integrated master as the LN reference. maybe it should be LinearFit to a representative subexposure first.

rob
 
pfile said:
at any rate the LN reference image was not a subframe, it was an integrated master. this probably has something to do with the problem, however, i was under the impression that it was OK to use an integrated master as the LN reference. maybe it should be LinearFit to a representative subexposure first.

Yes, I think LN non-frame references are OK, and LinearFit should not be necessary. What needs to change is MureDenoise. I am working on a fix, and with some additional metadata from LN and II, I think a solution may be possible.
 
sounds great, thanks for adding that support.

still, it sounds like the median of the LN stack was for some reason abnormally low? i understand that Mure has problems with the low median but it seemed like you were saying the low median is in itself a problem.

rob
 