Hi - I recently got a one-shot colour (OSC) camera to use with a mobile scope (I normally use a mono camera and filter wheel).
While I've been getting images that are not bad for a suburban location outside London, I've been struggling with the colour balance - my images look too red.
I've spent the morning trying to work out why - it turns out that, for whatever reason, the red pixels in my flats have about half the signal of the green and blue pixels. As a result, when the flat is applied, the red pixels are scaled to about twice the value of the blue and green pixels - and this seems to persist through colour calibration.
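To illustrate what I think is happening with a toy example (made-up numbers, and assuming an RGGB Bayer pattern):

```python
import numpy as np

# Toy flat: RGGB Bayer pattern where the red pixels sit at half the
# level of the green and blue pixels (hypothetical numbers).
flat = np.ones((4, 4))
flat[0::2, 0::2] = 0.5   # red sites in an RGGB pattern

# A perfectly uniform "light" frame...
light = np.full((4, 4), 1000.0)

# ...divided by that flat comes out with red pixels at twice the
# green/blue level - exactly the red cast I'm seeing.
calibrated = light / flat
print(calibrated[0, 0], calibrated[0, 1])  # 2000.0 1000.0
```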
I enclose two images of M31 - the first using the standard flats and about 5 hours of data. As you can see, the image is pretty good, but there is a red/brown tinge to the galaxy and hardly any blue at all.
For the second, I used Python to multiply the red pixels in the flat by a constant so that the median value was about the same for the red, green and blue pixels, then reprocessed one night's subs (rough sketch of the script below).
The processing is not as good (it was just done quickly), but as you can see, the colour balance is much better.
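In case it's useful, this is roughly what my script did - a sketch rather than my exact code, assuming an RGGB Bayer pattern (adjust the slices if your camera is GRBG, GBRG, etc.) and placeholder filenames:

```python
import numpy as np
from astropy.io import fits  # assumes the master flat is a FITS file

with fits.open("master_flat.fits") as hdul:
    flat = hdul[0].data.astype(np.float64)

# Per-channel medians from the Bayer mosaic (RGGB assumed).
med_r = np.median(flat[0::2, 0::2])
med_g = np.median(np.concatenate((flat[0::2, 1::2].ravel(),
                                  flat[1::2, 0::2].ravel())))
med_b = np.median(flat[1::2, 1::2])

# Scale the red sites so their median matches green, and likewise
# for blue if it is also off.
flat[0::2, 0::2] *= med_g / med_r
flat[1::2, 1::2] *= med_g / med_b

fits.writeto("master_flat_balanced.fits", flat, overwrite=True)
```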
This made me wonder whether the standard process for calibrating OSC flats is correct - should the colour channels be scaled independently to avoid this effect?
Colin