Help with Background Extraction of a Complex Gradient

Hello there,

I am not able to extract the background/gradients of my linear image properly. I also tried it on each channel separately; the result was not really better.
I tried ABE and DBE, but I think (and hope ;)) that the result can still be much better.

What I already did with the image after stacking:
- RGB Working space (set all to 1)
- Linear Fit
- Dynamic crop

Any help is appreciated.
Thanks!

XISF-File:
 
A few points to start with:
  • The image seems to be suffering from uncorrected vignetting; what flats (if any) did you use and how were they applied?
  • This image is not undersampled; you are really not gaining anything from drizzle except bigger files and longer processing times.
  • It would be helpful to know more about the data (how many frames; what calibration frames; what calibration workflow).
DBE with a lot of hand-picked samples (784) can get rid of most of the background (though I suspect a well-calibrated flat would do better):
[attached screenshot: DBE result]

The faint residues left are very low level (a few ADU), only made visible by the STF. You can see that some faint banding structure (almost certainly sensor-related) is becoming visible. Both of these would disappear with a little non-linear adjustment.
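To make the idea concrete, here is a rough Python sketch of sample-based background modelling. This is only a conceptual stand-in, not PixInsight's actual DBE code; the box size, grid spacing and smoothing value are arbitrary, and `img`/`samples` are assumed inputs.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import zoom

def subtract_background(img, samples, box=15, grid=64):
    """img: 2-D float array; samples: list of (x, y) hand-picked background positions."""
    h, w = img.shape
    # Robust local sky estimate: median of a small box around each sample.
    vals = [np.median(img[max(0, y - box):y + box + 1,
                          max(0, x - box):x + box + 1]) for x, y in samples]
    # Smooth surface through the samples (DBE uses its own spline model).
    spline = RBFInterpolator(np.asarray(samples, float), np.asarray(vals),
                             kernel='thin_plate_spline', smoothing=1e-6)
    # Evaluate on a coarse grid, then scale up to full resolution.
    gy, gx = np.mgrid[0:h:grid, 0:w:grid]
    coarse = spline(np.column_stack([gx.ravel(), gy.ravel()]).astype(float))
    background = zoom(coarse.reshape(gy.shape),
                      (h / gy.shape[0], w / gy.shape[1]), order=3)
    return img - background   # subtraction suits additive sky gradients
```

DBE also offers division instead of subtraction; division is the appropriate choice for multiplicative effects such as uncorrected vignetting, while subtraction fits additive light-pollution gradients like the one here.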
Note: your FITS header has "effective focal length"=50mm; ImageSolver gets 1011mm (using drizzle-modified pixel size 1.88um).
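For anyone wondering where the 1011 mm comes from: ImageSolver measures the image scale from the matched stars and converts it to a focal length through the standard small-angle relation. A quick check with the values quoted above (1.88 µm drizzle-modified pixels) looks like this:

```python
def plate_scale(pixel_um, focal_mm):
    """Image scale in arcsec/pixel: 206.265 * pixel size [um] / focal length [mm]."""
    return 206.265 * pixel_um / focal_mm

print(plate_scale(1.88, 1011.0))   # ~0.38"/px, the scale the solver actually measured
print(plate_scale(1.88, 50.0))     # ~7.8"/px, what the 50 mm header value would imply
```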
 
Thanks a lot! :)
I will retry DBE with a lot of hand-picked samples.

Unfortunately I wasn't able to take any flats, because I had to start the session well after sunset and end it well before sunrise.

I also wondered about drizzle: I tried the light integration with and without it, and the results were noticeably better with drizzle. I can zoom in about 1.5x to 2x further with drizzle and still get the same smoothness, compared to the stacking result without drizzle.

The stack contains 50 lights (90 s each) at ISO 400 and 100 bias frames (=> master bias => Superbias). Dithering was used, of course, but no darks (DSLR...).
 
I can zoom in about 1.5x to 2x further with drizzle and still get the same smoothness
This is rather like using a high magnification eyepiece on a telescope - you can "zoom in" (read "magnify") more, but once you reach the resolution limit (either of the telescope, or more usually of the ambient seeing) you are just magnifying the same low angular resolution image - you are not getting any more resolution (so you would get the same "zoom" by up-sampling and smoothing your undrizzled image).
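For reference, "up-sampling and smoothing" can be as simple as the following Python sketch. It is only meant to illustrate the point; the 2x factor and the Gaussian sigma are arbitrary, and `undrizzled` is an assumed single-channel image.

```python
from scipy.ndimage import zoom, gaussian_filter

def upsample_and_smooth(img, factor=2.0, sigma=1.0):
    """Cubic up-sampling followed by a light Gaussian blur."""
    upsampled = zoom(img, factor, order=3)      # interpolate to 2x the pixel count
    return gaussian_filter(upsampled, sigma)    # mimic drizzle's extra smoothness

# e.g. preview = upsample_and_smooth(undrizzled)   # `undrizzled` is assumed to be loaded
```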
 
Without flat correction, the DBE background subtraction is probably as good as you can do. With complicated gradients I sometimes get a marginal improvement by applying DBE a second time, concentrating samples on any residual structure. However, in this case I think this has already reached the point of diminishing returns.
 
About drizzle: This is a comparison of the same star shown at the same size:
Left is with 2x drizzle (3:1) and right without it (6:1).
[attached screenshot: star comparison, drizzled vs. undrizzled]


If I understand you correctly, I can achieve this improvement without using drizzle, by up-sampling and smoothing after the general processing instead?


Anyways: Thanks for your help regarding DBE!
 
I applied ABE with function degree 1 and then ABE with function degree 4. The integration with a strong STF applied looks like this:
[attached screenshot: integration with strong STF]


The faint residues left are very low level (a few ADU), only made visible by the STF. You can see that some faint banding structure (almost certainly sensor-related) is becoming visible. Both of these would disappear with a little non-linear adjustment.
The "banding" is not a sensor-related banding structure, because it is not precisely aligned to the sensor edges. These structures seem to be caused by not well-matching field of view of some subframes.

The bright artifact near the center does not seem to be a vignetting relic either. It might be caused by stray light, but I don't think so. I recommend blinking the calibrated, unaligned subframes and checking whether this artifact is contained in only some of them, in order to exclude them from the integration.

Bernd
 
If I understand you correctly, I can achieve this improvement without using drizzle, by up-sampling and smoothing after the general processing instead?
If you upload an undrizzled image, I'll try and demonstrate!
I applied ABE
ABE is really not up to this problem. You need the selective local sampling of DBE.
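To illustrate the difference: ABE-style correction fits a single low-order polynomial over the whole frame, so any local structure the polynomial cannot represent is left behind. Below is a rough sketch of that kind of global fit (not ABE's actual algorithm; it also skips ABE's sample rejection, so in practice bright objects would bias it, and `img` is an assumed 2-D array).

```python
import numpy as np
from scipy.ndimage import zoom

def polynomial_background(img, degree=4, step=32):
    """Global least-squares fit of monomials x^i * y^j with i + j <= degree."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h:step, 0:w:step]      # coarse sampling of the image
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    # Normalised coordinates keep the least-squares problem well conditioned.
    D = np.column_stack([(xs.ravel() / w)**i * (ys.ravel() / h)**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(D, img[ys, xs].ravel(), rcond=None)
    coarse = (D @ coeffs).reshape(xs.shape)
    # One global surface for the whole frame: local structure cannot be followed.
    return zoom(coarse, (h / coarse.shape[0], w / coarse.shape[1]), order=3)
```

The hand-placed samples and spline surface sketched earlier in the thread can follow local structure that a single global surface like this simply smooths over.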
it is not precisely aligned to the sensor edges
Are you sure? Remember, this is a dithered, aligned, integrated image; are you sure the bands weren't aligned in the raw subs?

Oops - I've only just realised this post was not by the OP.
 
The bright artifact near the center does not seem to be a vignetting relic either. It might be caused by stray light, but I don't think so. I recommend blinking the calibrated, unaligned subframes and checking whether this artifact is contained in only some of them, in order to exclude them from the integration.
Unfortunately it's present on every subframe (the longer the exposure, the more prominent it seems to be). It could be light pollution (stray light), but there really shouldn't be anything like that near my location. At least I haven't seen anything, and it was never there on other nights.

Are you sure? Remember, this is a dithered, aligned, integrated image; are you sure the bands weren't aligned in the raw subs?

I can't see them in the subframes, and a few of the exposures capture a FOV that is offset by about 10% (the galaxy on those is a bit down/left compared to the integrated image), so the banding could really have this cause. (I had a guiding problem once that night; the different FOV probably comes from there.)

I will try DBE again today or tomorrow and will post my result here again :)
 
I am currently working with DBE on each RGB channel separately. R and B worked great for me.
But the gradient (the big one in the middle) is far stronger in G. The DBE from R/B doesn't work here, so I still need to adapt it for G; hopefully I succeed.
What could be the reason that the gradient is much stronger in G? Is it due to the Bayer pattern GBRG - but shouldn't the gradient then be as easy to remove as for R/B?
 
I am currently working with DBE on each RGB channel separately. R and B worked great for me.
But the gradient (the big one in the middle) is far stronger in G. The DBE from R/B doesn't work here, so I still need to adapt it for G; hopefully I succeed.
What could be the reason that the gradient is much stronger in G? Is it due to the Bayer pattern GBRG - but shouldn't the gradient then be as easy to remove as for R/B?

My bad... I had to increase the tolerance in DBE for the G channel (which makes sense, because the difference within the background should be about doubled in G compared to R or B...).

The channel-combined and STF-stretched result now looks like this:


I am quite happy with that already.
But I definitely need to work on the "coloured" noise in the background (green/purple), though maybe this doesn't even show up like that with a careful stretch. Or maybe I need to reduce the contrast within the background in the non-linear state?
 
I definitely need to work on the "coloured" noise in the background (green/purple)
Because noise is uncorrelated in the separate colour channels, it will always have this "mottled" appearance. No amount of balancing will make it uniformly grey. Provided the averaged background is neutral, this pixel-by-pixel variation is expected. However, MLT adjustment can smooth it out a bit.
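As a toy illustration of why chrominance smoothing helps: the sketch below blurs only the colour information while keeping luminance intact, which is why the green/purple mottling softens without greying the image. This is not MLT (which works on multiscale wavelet layers); the luminance weights and sigma are arbitrary, and `rgb` is an assumed float image.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_chrominance(rgb, sigma=2.0):
    """rgb: float array of shape (H, W, 3) in [0, 1]; blur colour, keep luminance."""
    # Simple luminance estimate; the exact weights are not critical for the idea.
    lum = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    chroma = rgb - lum[..., None]                               # per-channel colour differences
    chroma = gaussian_filter(chroma, sigma=(sigma, sigma, 0))   # blur only in x/y
    return np.clip(lum[..., None] + chroma, 0.0, 1.0)           # luminance detail untouched
```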
 
If I understand you correctly, I can achieve this improvement without using drizzle, by up-sampling and smoothing after the general processing instead?
Try running the following process on your undrizzled (~6000x4000) integrated image, and compare with the drizzle. I think you will be surprised.
[attached screenshot: suggested process settings]
 