New script for denoising linear monochannel images

Thanks Mike- I guess I'll just try a few more times. Good to know that there is no point using a mask.

Thanks Rob- The resultant DBE image has massive blotches over it and is unusable. This remains even if I press the nuclear button, then undo the nuclear and redo the nuclear. Obviously a picture is worth a thousand words, so I'll post one in a day or two (I'm presuming with Valentine's Day I may be preoccupied- here's hoping).

Thank you both.
 
The nuclear button is the STF auto-stretch I was referring to. Most likely all that's happening is that STF is way overstretching the image based on the new noise profile.

rob
 
Nick, you might try Image > Screen Transfer Functions > Use 24-bit STF LUTs. This might solve the problem. In my tests, the background model image in particular sometimes appears to be posterized with normal STF, but this is just a normal STF quantization issue. 24-bit LUTs typically solve this type of problem. Occasionally 24-bit LUTs also fix display problems on processed integrations themselves. So this might be something to try.

An alternative to 24-bit LUTs is to simply apply the STF via HistogramTransformation, which uses a high precision algorithm and avoids this problem.
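To see why the LUT bit depth matters for a very dark, low-contrast background, here is a rough sketch. The midtones transfer function below follows PixInsight's published formula; the LUT model (quantizing the input before stretching) is a simplification for illustration only.

```python
# Sketch: an 8-bit screen LUT posterizes a dim background that a
# 24-bit LUT (or a full-precision HistogramTransformation) preserves.
def mtf(m, x):
    # Midtones transfer function with midtones balance m.
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

def stretch_via_lut(x, midtones, bits):
    # A LUT effectively quantizes the input to 2**bits levels first.
    levels = (1 << bits) - 1
    xq = round(x * levels) / levels
    return mtf(midtones, xq)

# A dim, smoothly varying background between 0.00100 and 0.00120:
samples = [0.0010 + i * 1.0e-5 for i in range(21)]
full = {round(mtf(0.01, s), 6) for s in samples}
lut8 = {round(stretch_via_lut(s, 0.01, 8), 6) for s in samples}
lut24 = {round(stretch_via_lut(s, 0.01, 24), 6) for s in samples}
print(len(full), len(lut8), len(lut24))  # 8-bit LUT collapses to 1 level
```

With 8 bits, all 21 distinct background values fall into the same LUT bin, which is exactly the posterization seen on screen.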

Mike
 
Rob has it right. If you denoise and then autostretch, you get a terrible looking image because of an overstretch. Nothing to do with DBE. Just try denoising and then applying autostretch again before you do anything else: you will get a crappy looking picture.
Geoff
 
Oh yes, running the script does not change the STF. So the result could look OK immediately after the script but then bad after a subsequent auto STF that overstretched.
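A small sketch of why the auto-stretch gets harsher after denoising. I'm assuming an STF-style algorithm here (shadows clipped at median - 2.8*MAD, midtones balance chosen so the median maps to a 0.25 background); the constants are illustrative assumptions, not PixInsight's exact internals.

```python
# Sketch: a smaller noise estimate (MAD) makes the auto-computed
# midtones balance smaller, i.e. the stretch more aggressive.
def mtf(m, x):
    # Midtones transfer function with midtones balance m.
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

def auto_midtones(median, mad, target=0.25, clip=-2.8):
    c0 = max(median + clip * mad, 0.0)
    # Solving mtf(m, x) = target for m gives m = mtf(target, x).
    return mtf(target, median - c0)

noisy = auto_midtones(median=0.01, mad=0.003)      # before denoising
denoised = auto_midtones(median=0.01, mad=0.0005)  # after denoising
# A smaller midtones balance means a stronger stretch:
print(round(noisy, 4), round(denoised, 4))
```

Same image statistics except for the noise estimate, yet the post-denoise auto-stretch is several times harder, which is why the result suddenly looks overstretched.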
 
Hi all and thanks for your inputs,

Tried the 24-bit STF- same poor image.
Tried the STF after MURE but before DBE and it all looks fine.
The screenshot below shows three versions (left to right), all using the nuclear button (clicked several times, cancelled, clicked again :) ).

Left is untouched, middle is post MURE, right is post MURE-DBE.
2nd screenshot is just zooms of the above.

It does seem partially alleviated if I ignore the nuclear result and do it manually with HistogramTransformation (screenshot 3). I'm not sure if it is maybe because I'm so used to moving the skyfog peak to the 33% across position, and this maybe is not necessary now that I have so little noise in the black. Is it because the script is so good I have to recalibrate my brain?


 
Ok, lossy JPGs add confusion. Can you post a zip containing the 3 XISFs and a screenshot of the MureDenoise script settings? If the zip is too big, maybe a link to a zip in Dropbox?
 
Yes, sorry about that.
Here is a link. Thank you.

https://www.dropbox.com/s/2dp0s9l95x5zij4/red_MURE_DBE.zip?dl=0
 
Thank you Nick. Two things:

1) The DBE image is for some unknown reason overstretched. Here is an example of how to remove the overstretch: open the DBE image, open the STF process, make sure the track view 'check' icon is enabled and the DBE image is the active window, then click on STF's wrench icon. You will see on the top row the numbers 0.005630, 0.002633, 1.0. Change the first two to 0.0014 and 0.015, and leave the last at 1.0. These changes will lighten the image and reduce the stretch. This is only a quick example; you can tweak the numbers and/or move the STF sliders as you wish, of course.

2) The denoising is too aggressive. I am guessing that this is because ImageIntegration used widely varying frame normalization parameters, and MureDenoise is not able to track these differences sufficiently well. Please try setting the script's Variance scale parameter to, say, 0.7 and denoising once again.

Basically, if you zoom way into red_MURE with the appropriate STF, you will see a slight checkerboard pattern in adjacent pixels. Checkerboarding indicates over-denoising. Note, problems like this can happen if the script's detector parameters are set incorrectly (i.e., gain and Gaussian noise). If you have not already done so, you can run the MureDenoiseDetectorSettings script as a double-check for these values.
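For what it's worth, checkerboarding can be quantified. The sketch below (hypothetical data and method, not part of the MureDenoise script) uses the fact that over-denoising leaves adjacent pixels anti-correlated, while independent pixel noise gives a neighbor correlation near zero.

```python
# Sketch: measure the Pearson correlation between each pixel and its
# right-hand neighbor; a strongly negative value signals checkerboarding.
def neighbor_correlation(row):
    a, b = row[:-1], row[1:]
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5

# An alternating 'checkerboard' row around a background of ~0.11:
checkerboard = [0.10 if i % 2 == 0 else 0.12 for i in range(100)]
print(round(neighbor_correlation(checkerboard), 2))  # close to -1
```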

Mike
 
Thanks Mike. I will give it a go again tonight.

It seems odd the screen stretch overstretches. I've never even noticed the STF wrench icon so something new to play with :)
Yes I noticed the checkerboard pattern too but wasn't sure if that was just the lesser of two evils (noise being the other evil).
I have used the MureDenoiseDetectorSettings script and the numbers seem to be more or less identical whichever flats and biases I use.

BTW I'm not really clear about what the Flat selection in the MureDenoise script brings to the party.

Many thanks. I'll let you know how I get on. The nice thing is that even with my incorrect settings the result is good. :)
 
Thanks Mike,

I can't actually find the "track view 'check' icon". What does it look like? I've tried hovering over everything in the process but no tool-tip seems to say "track view". That said, the image does change when I slide stuff, type in the numbers in the wrench dialog, etc., so I guess it is on.

Changing the variance scale seemed to improve matters. Then PI froze (I've had an issue, like a lot of people, something to do with the graphics card and Windows) - I digress. More playing around needed for me :)

Thanks for all your help.

 
A quick question about bias frames chosen for MureDenoiseDetectorSettings, which may account for slightly weird results.

I use a "super master bias" for my calibrations (which I do in WBPP). The bias frames I've been loading into the detector settings script are uncalibrated ones, and obviously not a master bias or super master bias.

I've tried loading both the master bias and the super master bias into the detector settings alongside my raw untouched biases, and (as I might expect, even with my limited understanding of these things) it massively changes the generated values.

So my question is, what are the procedure/considerations for using super master bias calibrated files?

Many thanks


 
Yes, the detector settings script needs raw, untouched frames, both flats and biases (or darks if you use darks rather than biases). If you instead use masters in this script, you will likely get incorrect values that, if used for denoising, will likely compromise denoising accuracy.
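To illustrate why raw frames are needed, here is a sketch of the classic photon-transfer estimate of gain from a pair of raw flats. This is an assumption about the general method; the script's internals may differ. The key point: shot-noise variance in ADU is mean/gain, so gain ≈ (mean1 + mean2) / var(flat1 − flat2), and a master breaks this because averaging has already suppressed the very noise being measured.

```python
# Sketch: estimate detector gain (e-/ADU) from two simulated raw flats.
import random
import statistics

random.seed(1)
true_gain = 2.0   # e-/ADU, hypothetical detector
level = 20000.0   # flat exposure level in ADU
n = 100_000       # pixels

def flat():
    # Shot noise approximated as Gaussian with variance = level / gain.
    sigma = (level / true_gain) ** 0.5
    return [random.gauss(level, sigma) for _ in range(n)]

f1, f2 = flat(), flat()
diff = [a - b for a, b in zip(f1, f2)]
gain = (statistics.fmean(f1) + statistics.fmean(f2)) / statistics.pvariance(diff)
print(round(gain, 2))  # close to the true 2.0 e-/ADU
```

If f1 and f2 were masters averaged from many frames, var(f1 − f2) would be far smaller and the estimated gain far too large, which matches the "massively changed" values reported above.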

You are free to use masters and/or super masters for calibration purposes without restriction. Calibration always adds the residual noise in the masters to the result, so less noisy masters are always better.
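The arithmetic behind "less noisy masters are always better" is quick to check: independent noise adds in quadrature when a master is subtracted, and averaging N frames cuts the master's own noise by sqrt(N). The numbers below are hypothetical.

```python
# Sketch: residual noise after calibration with masters of different depth.
import math

def calibrated_sigma(light_sigma, master_sigma):
    # Subtracting the master adds its residual noise in quadrature.
    return math.hypot(light_sigma, master_sigma)

read_noise = 10.0                          # per-frame noise in ADU
master_16 = read_noise / math.sqrt(16)     # master built from 16 biases
master_256 = read_noise / math.sqrt(256)   # 'super' master from 256 biases
print(round(calibrated_sigma(read_noise, master_16), 2),
      round(calibrated_sigma(read_noise, master_256), 2))
```

The deeper master adds almost nothing to the calibrated frame's noise, which is the appeal of a super master bias.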
 
nm303 said:
Changing the variance scale seemed to improve matters.

Ok, good.

Looking at the metadata recorded in red.xisf, it appears some frames in the integration were significantly modified by ImageIntegration's normalization process. For example, the 18th frame was scaled by about 2.2 and offset by about 0.4 relative to the reference frame. This sort of result indicates that ImageIntegration believes that the frame was exposed less than half as long as the reference frame, and also had a significantly different median sky background level. The script tends to have major difficulty tracking the impact of these large normalization differences on the noise statistics, and hence the need to tweak the variance scale parameter.

This is a problem with the script. Ideally it would handle these types of normalization differences well without intervention.
 
Hi Mike,

Thanks for the clarification on the use of raw biases and flats. The script does seem very clear in its request for these, but I thought it worth double-checking.

I guess I can just run the script a few times with different variance scale settings and see which looks best. Will the script work appropriately with previews and/or massive crops, for such testing purposes?

 