Dear PI Users,
there are probably many users here with a cooled CCD astro camera.
I came across a confusing issue with these cameras.
Commercial advertising promises us "16bit" sampling, which should outperform
DSLRs. OK, this seemed logical to me.
But after reading this article:
http://www.ccd.com/ccd111.html and computing the dynamic range
of some Sony and KODAK chips together with the readout noise of some electronics by ATIK, SBIG, ..., I realized that
no chip and no camera manufacturer really reaches the 16bit limit that they promise.
Example 1 (low cost camera): ATIK 320E with SONY ICX-274
Readout noise = 3 e
Full-Well-capacity: 14000 e
Dyn. range -> 14000 / 3 ≈ 4667:1, i.e. about 12-13 bit sampling depth (2^12 to 2^13).
Example 2: Moravian Instruments G3-11000 with KAI 11002
Readout noise = 14 e
Full-Well-capacity: 60000 e
Dyn. range -> 60000 / 14 ≈ 4286:1, i.e. about 12-13 bit sampling depth (2^12 to 2^13).
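For anyone who wants to repeat the arithmetic, here is a minimal Python sketch (the full-well and read-noise numbers are just the ones from the two examples above; the helper name is my own):

```python
import math

def dynamic_range_bits(full_well_e, read_noise_e):
    """Dynamic range as a ratio (full well / read noise) and in bits (log2)."""
    ratio = full_well_e / read_noise_e
    return ratio, math.log2(ratio)

# Example 1: ATIK 320E with SONY ICX-274
ratio1, bits1 = dynamic_range_bits(14000, 3)
print(f"ICX-274:  {ratio1:.0f}:1 -> {bits1:.1f} bits")

# Example 2: Moravian G3-11000 with KAI 11002
ratio2, bits2 = dynamic_range_bits(60000, 14)
print(f"KAI 11002: {ratio2:.0f}:1 -> {bits2:.1f} bits")
```

Both chips come out around 12 bits, well short of the advertised 16.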
So although these cameras have a 16bit A/D converter, the extra bits would be wasted, since they sample only a roughly 12bit range with a 16bit device?
Granted, CCD cameras provide cooling and better quantum efficiency than DSLRs. But do they
really produce more differentiated images than DSLRs with "real" 14bit sampling?
When does this advantage actually show up in the picture? In stacking? In the faint areas where there is the most noise (I am thinking of
dark nebulae)?
Or am I completely wrong ?
Any feedback or explanation would be very helpful.
Kind regards!
christoph