Forgive me if I have this wrong or if it has been raised before. A forum search turned up one other user reporting the same thing, but there was no reply that resolved it. I am trying to use the Statistics tool to check/verify the system gain, etc. of my DSLR.
The process I am following is:
- Debayer a flat frame (Debayer -> Bilinear).
- Extract the green channel (ChannelExtraction).
- Create a preview in a small, flattish part of the flat.
- Run Statistics on the preview (a quick cross-check is sketched after this list).
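For anyone who wants to reproduce the check outside PixInsight, something like the sketch below should give the same normalised figures. I am assuming the extracted green channel has been saved as a 32-bit float FITS; the file name and crop coordinates are just placeholders.

```python
import numpy as np
from astropy.io import fits

# Placeholder: the extracted green channel, saved from PixInsight as 32-bit
# float FITS so the pixel values are already in the normalised [0,1] range.
green = fits.getdata("flat_green.fits").astype(np.float64)

# Placeholder coordinates for the small, flattish preview region.
crop = green[1000:1200, 1500:1700]

var_norm = crop.var()   # numpy default: N in the denominator
std_norm = crop.std()

print(f"Variance (norm): {var_norm:.6e}")
print(f"StdDev   (norm): {std_norm:.6e}")
print(f"sqrt(Variance):  {np.sqrt(var_norm):.6e}")  # should match StdDev
```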
If I check the StdDev and Variance with the 'Range' dropdown set to 'Normalised Real [0,1]', I get meaningful results: the StdDev figure is the square root of the Variance figure, as you would expect. In this case Variance is approximately 2.2E-04 and StdDev is 1.48E-02.
If I set the 'Range' dropdown to '16-bit [0,65535]', then StdDev (in this case) is 971.428 and Variance is 14.400. Clearly this is wrong: Variance should be the square of StdDev, which would be roughly 944,000 in 16-bit units, not a value far smaller than StdDev! The same issue occurs if you select any other bit range from the 'Range' dropdown.
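If I have understood the scaling correctly, it looks as though the displayed Variance is only multiplied by 65535 once (like StdDev) instead of by 65535 squared. A quick numpy sketch of the arithmetic with the figures above:

```python
import numpy as np

# Figures reported by Statistics in the normalised [0,1] range.
var_norm = 2.2e-4
std_norm = 1.48e-2
print(np.isclose(std_norm, np.sqrt(var_norm), rtol=1e-2))   # True: consistent

k = 65535  # scale factor for the 16-bit range

# StdDev scales linearly with the range; Variance must scale with its square.
print(std_norm * k)      # ~970      -> close to the displayed 971.428
print(var_norm * k**2)   # ~945,000  -> what a consistent Variance should read

# The displayed 14.400 is what you get if Variance is also scaled linearly.
print(var_norm * k)      # ~14.4
```

So the underlying numbers are internally consistent; only the scaling of the Variance field in the other ranges seems off.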
Now, I can happily work with the Normalised Real figures, but it does not exactly inspire confidence when something as basic as image statistics is displayed incorrectly.
As ever, I am happy to be told I am an idiot who is doing it wrong or doesn't understand, but I am pretty sure I am not (for once).