Author Topic: Noise Evaluation Weights  (Read 2506 times)

Offline ngc1535

  • PixInsight Old Hand
  • ****
  • Posts: 326
Noise Evaluation Weights
« on: 2018 April 22 11:16:39 »
Hi,

I am certain things are being done properly...however, I am unable to interpret the results.
When using Noise Evaluation in ImageIntegration, the weights output to the process console are not what I expect.
I combined a set of images that are 5-minute integrations and 15-minute integrations; I expected to see weights of roughly 0.333 and 1 (or 1 and 3).
However, I see weights that are all nearly 1 (0.98 ... 1.08, etc.).

If I do the same combine with exposure time weighting - of course I see the expected ratio of weights.

Where is my disconnect?

Thanks,
adam

Offline pfile

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 4729
Re: Noise Evaluation Weights
« Reply #1 on: 2018 April 22 13:03:24 »
i suppose if the NoiseEvaluation script shows similar noise in all the images then that's what's happening. it does seem counterintuitive that such different exposure lengths would have similar noise though.

which kind of image is first in the list? that's the reference image against which all the others are compared. in theory this should not matter too much but i wonder if changing it has any effect.

rob

Offline ngc1535

  • PixInsight Old Hand
  • ****
  • Posts: 326
Re: Noise Evaluation Weights
« Reply #2 on: 2018 April 22 13:16:56 »
Thanks for the reply Rob.
I believe the reference happened to be a short exposure.
When I am at my computer again, I will try the experiment again with a long exposure as reference... but I can't see how, other than reversing the ratio of weights, this would affect anything.

Do you have data of different exposure times for which the results come up with weights approximately the ratio of exposure times?

The data I am using were taken under the same conditions as far as equipment and sky.

I am looking at this stuff as a sanity check of my understanding of how things work...but I need to stop and consider things when I get a result like this.

I hate imposing on others- but if it is helpful I am glad to make the calibrated and aligned data available  (it is already on my Google Drive).

-adam

Offline pfile

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 4729
Re: Noise Evaluation Weights
« Reply #3 on: 2018 April 22 16:22:19 »
i can look back thru my data, but generally i use 1800s for narrowband and 240s for RGB given the skies here, so i probably do not have any images of the same target with differing lengths (unless i changed the durations by mistake :)

rob

Offline RickS

  • PTeam Member
  • PixInsight Jedi
  • *****
  • Posts: 1298
Re: Noise Evaluation Weights
« Reply #4 on: 2018 April 22 17:40:49 »
I had a play with Adam's data over the weekend (we have been discussing in email) and tried a couple of SubframeSelector weighting schemes which represented the sub lengths better.  Unfortunately, they gave worse results than a "simple" noise evaluation weighted integration.

Quote
I combine a set of images that are 5 minute integrations and 15 minute integrations- I am expecting to see weights that are roughly .333 and 1 (or 1 and 3).
However, I see weights that are all nearly 1 (.98 ... 1.08 ... etc).

You won't get a 3:1 improvement in SNR, Adam.  In a perfect world it would be around 1.7:1 (the square root of 3.)  That's still a much bigger difference in weighting than NoiseEvaluation gives on this data set.
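That sqrt(3) figure follows from signal growing linearly with exposure time while shot noise grows as its square root, so SNR scales as sqrt(t). A minimal sketch (illustrative only, not anything from PixInsight itself):

```python
import math

# In the sky-limited regime, signal ~ t and shot noise ~ sqrt(t),
# so SNR ~ sqrt(t). The ideal SNR gain of a longer exposure over a
# shorter one is therefore sqrt(t_long / t_short).
def snr_ratio(t_long, t_short):
    """Ideal SNR improvement of a t_long exposure over a t_short one."""
    return math.sqrt(t_long / t_short)

print(snr_ratio(15, 5))  # ~1.73, i.e. the square root of 3
```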

Offline ngc1535

  • PixInsight Old Hand
  • ****
  • Posts: 326
Re: Noise Evaluation Weights
« Reply #5 on: 2018 April 22 18:35:14 »

Quote
You won't get a 3:1 improvement in SNR, Adam.  In a perfect world it would be around 1.7:1 (the square root of 3.)  That's still a much bigger difference in weighting than NoiseEvaluation gives on this data set.

Hmmm... I am definitely confused by this statement. In an ideal/perfect world with a noiseless detector, I would hope that the weight ratio would approximate the exposure time ratio for a linear detector. I guess I am stuck on the idea that the weights are like a "weighted mean" for combining images. The uncertainty in the value itself from photon noise and camera noise is more complicated, sure (although I know of square roots of N... not cube roots...).

So... if I am at the telescope and take a 5 minute exposure and then follow it up with a 15 minute exposure... I can't see how the weights assigned to these images under the same conditions wouldn't approximate 1:3. This isn't an article of faith... it just relies on the idea that the camera is a linear detector?
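A minimal sketch of the weighted mean I have in mind (illustrative only, not PixInsight's actual implementation) would use inverse-variance weights:

```python
import math

# Weighted mean with inverse-variance weights, w_i = 1 / sigma_i**2,
# where sigma_i is the noise estimate of frame i after scaling to a
# common flux level.
def weighted_mean(values, sigmas):
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(w * v for w, v in zip(weights, values))
    return total / sum(weights)

# In the ideal sky-limited case, the scaled noise of a 5-minute frame is
# sqrt(3) times that of a 15-minute frame, so the weight ratio comes out
# 3:1 -- the exposure-time ratio.
sigma_short, sigma_long = math.sqrt(3.0), 1.0
print((1 / sigma_long ** 2) / (1 / sigma_short ** 2))  # 3.0
```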

-adam


Offline RickS

  • PTeam Member
  • PixInsight Jedi
  • *****
  • Posts: 1298
Re: Noise Evaluation Weights
« Reply #6 on: 2018 April 22 19:41:39 »
Quote
Hmmm... I am definitely confused by this statement. In a ideal/perfect world with a noiseless detector- I would hope that the weights ratio would approximate the exposure time ratio with a linear detector. I guess I am stuck on the idea that the weights are like a "weighted mean" for combining images. The uncertainty in the value itself from photon noise and camera noise- this is more complicated sure (although I know of square roots of N... not cube roots... ).   

So... if I am at the telescope and I take 5 minute exposure and then follow it up by at 15 minute exposure... I can't see how the weights assigned to these images under the same conditions wouldn't approximate 1:3. This isn't a faith assessment... but just relies on the idea the camera is a linear detector?

Sorry, Adam.  I'm thinking of the relative SNR and confusing that with the weighting.

Offline ngc1535

  • PixInsight Old Hand
  • ****
  • Posts: 326
Re: Noise Evaluation Weights
« Reply #7 on: 2018 April 23 07:22:55 »
A couple of other things:

1. It was suggested that perhaps the resampling via registration can cause an issue with the noise estimates (and thereby the weighting). I did try on the unregistered images- this did give different numbers- but again not the expected weights.
2. In fact I am still expecting a weighted mean result- which is that all of the weights will add to N-- the total number of images. In the above experiment this didn't happen... which also confused me.
3. If #1 is an issue- then I assume that using NN would alleviate the issue somewhat.

To summarize the thread: given inputs with different exposure times, and using noise evaluation weighting in ImageIntegration, should I expect to see weight ratios that approximate the exposure time ratios?

-adam

Offline chris.bailey

  • PixInsight Addict
  • ***
  • Posts: 235
Re: Noise Evaluation Weights
« Reply #8 on: 2018 April 24 00:20:19 »
Adam

A few thoughts.

Read noise is constant per frame whatever the duration, but it will be a higher proportion of the total noise in the shorter frames. There are several articles around on the web by John Smith (Hidden Loft) and Stan Moore that are worth a read.
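The read-noise point can be sketched with assumed numbers (the read noise and sky rate below are illustrative, not measured from Adam's camera): total per-pixel noise adds read noise and sky shot noise in quadrature, so the read-noise fraction is larger in the shorter frame.

```python
import math

# Per-pixel noise in electrons for a frame of duration t_min, combining
# a fixed read noise with sky shot noise in quadrature. The numbers are
# assumed for illustration only.
def frame_noise(t_min, read_noise_e=8.0, sky_e_per_min=50.0):
    return math.sqrt(read_noise_e ** 2 + sky_e_per_min * t_min)

for t in (5, 15):
    total = frame_noise(t)
    # read-noise fraction of total noise: ~0.45 at 5 min, ~0.28 at 15 min
    print(t, round(8.0 / total, 2))
```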

Have you tried with uncalibrated frames? Calibration adds its own noise contribution. This contribution will be similar for the short and long frames so a higher proportion in the short frames. Using a master dark that is based on very few dark frames can add significantly to the overall noise for instance. Dark scaling can do odd things to the noise evaluation weights too in my experience. Low signal flats can add a significant dollop of noise.

Imaging conditions can play a major part. If the 5 minute frames were taken under better skies than the longer ones then the weightings will reflect that. It only takes a bit of high cloud or rising moon.

The weighting is relative to the reference rather than an absolute metric so the weights won't sum to N. If you use the frame with highest S/N as a reference then all the weightings will be fractional.
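One plausible form of such relative weighting (a sketch of the general idea, not necessarily PixInsight's exact formula): each frame's weight is expressed against the reference frame's noise, so the reference gets weight 1 and the weights do not sum to N.

```python
# Relative inverse-variance weights against a chosen reference frame:
# w_i = (sigma_ref / sigma_i)**2. The reference's own weight is 1, and
# a frame with lower noise than the reference gets a weight above 1.
def relative_weights(sigmas, ref_index=0):
    ref = sigmas[ref_index]
    return [(ref / s) ** 2 for s in sigmas]

print(relative_weights([2.0, 2.0, 1.0]))  # [1.0, 1.0, 4.0]
```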

Have you tried measuring the noise in a sample of the short and long frames (calibrated and uncalibrated) to see if there is an actual significant difference?

Chris

PS: also worth checking the .fits header for the temperature set point being the same for all frames. That's thrown me a couple of times, e.g. some at -10 and others at -20.
« Last Edit: 2018 April 24 00:30:25 by chris.bailey »

Offline ngc1535

  • PixInsight Old Hand
  • ****
  • Posts: 326
Re: Noise Evaluation Weights
« Reply #9 on: 2018 April 24 00:44:10 »
Quote
Read noise is a constant to each frame duration but will be a higher proportion in the shorter frames.

So: smaller S/N, and therefore smaller weight, for the short exposures?

Quote
There are several articles around on the web by John Smith (Hidden Loft) and Stan Moore that are worth a read.

Yes, I am familiar with this literature.




Quote
Have you tried with uncalibrated frames?

Yes, the weights do not approximate the exposure times.

Quote
Calibration adds its own noise contribution. This contribution will be similar for the short and long frames so a higher proportion in the short frames. Using a master dark that is based on very few dark frames can add significantly to the overall noise for instance.

Master dark is made from a significant number of dark frames.


Quote
Dark scaling can do odd things to the noise evaluation weights too in my experience.

Master darks are scaled down from 1800 seconds.



Quote
Low signal flats can add a significant dollop of noise.

This isn't an issue.

Quote
Imaging conditions can play a major part. If the 5 minute frames were taken under better skies than the longer ones then the weightings will reflect that. It only takes a bit of high cloud or rising moon.

In this case (which is why it is of such interest to me), some of the short and long exposures were taken under the same conditions sequentially.

Quote
The weighting is relative to the reference rather than an absolute metric so the weights won't sum to N. If you use the frame with highest S/N as a reference then all the weightings will be fractional.

Yeah, you're right.

Quote
Have you tried measuring the noise in a sample of the short and long frames (calibrated and uncalibrated) to see if there is an actual significant difference?
I have not... but I should. What is the best method?
If the difference in noise is not significant, then what drives the weighting? There is certainly a difference in signal (which is of course the part I am stuck on).
