Hi Stephane,
how do I make sure that the DBE model generated is correct?
Excellent question. These things should not be taken for granted without verification.
After subtracting a good background model, the image should be uniformly illuminated (what we informally call a *flat* image). There are several ways to evaluate uniformity. Personally, I think a careful visual inspection is the best practical approach: I often apply a wild stretch and a color saturation boost to a temporary duplicate of the corrected image. A strong nonlinear stretch immediately reveals which areas of the image are poorly corrected: either overcorrected, which leaves areas that are too dark, or undercorrected, which means that the light pollution gradients have not been properly removed.
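Just to make the idea concrete, here is a toy Python sketch of such a nonlinear stretch, using the standard midtones transfer function (the pixel values and the midtones level are made up for illustration; this is not something you would run instead of PixInsight's own tools):

```python
def mtf(x, m):
    """Midtones transfer function: maps the midtones level m to 0.5
    while keeping 0 -> 0 and 1 -> 1. A small m gives a very strong,
    nonlinear stretch of the shadows."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# A "wild" stretch: push the midtones point far down so tiny residual
# gradients in the background become large, obvious differences.
background_samples = [0.001, 0.002, 0.004, 0.008]  # hypothetical values
stretched = [round(mtf(p, 0.005), 3) for p in background_samples]
print(stretched)  # → [0.166, 0.285, 0.444, 0.616]
```

Note how background values that differ by a few thousandths, invisible on a linear display, end up spread across half of the output range, which is exactly why a poorly corrected area jumps out after such a stretch.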
But one can be more exhaustive, of course. A more objective way to test for illumination uniformity is to define a different DBE instance, that is, an instance with different samples over background regions, and generate a second background model. This second model should be essentially a constant function (an image filled with a single constant value). In other words, this method validates the background model by applying a counter-test criterion.
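The counter-test can also be stated numerically. A toy Python check of the "essentially constant" criterion might look like this (the sample values and the 1% tolerance are arbitrary assumptions for illustration; a real test would evaluate actual model pixels):

```python
import statistics

def is_flat(model_pixels, tol=0.01):
    """Counter-test criterion: a background model generated for an
    already-corrected image should be essentially constant, i.e. its
    dispersion should be a negligible fraction of its mean level
    (under 1% by default here)."""
    mean = statistics.fmean(model_pixels)
    return statistics.pstdev(model_pixels) <= tol * mean

# Hypothetical samples from a second-pass background model:
good_model = [0.0100, 0.0100, 0.0101, 0.0100]  # essentially constant
bad_model  = [0.0100, 0.0150, 0.0200, 0.0250]  # a leftover gradient

print(is_flat(good_model), is_flat(bad_model))  # → True False
```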
Yet another way, 100% qualitative but very sensitive, is to generate a 3D profile of the image. You can use the excellent 3DPlot JavaScript script by Andrés Pozo and David Serrano. The script cannot be used with a large image unless you have a powerful machine, but you can downsample the image strongly before applying it, because the background illumination profile is an extremely smooth function.
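The downsampling step is just block averaging, so nothing relevant to the profile is lost. A minimal sketch in plain Python (the tiny 4x4 "image" and factor are made up; in practice you would simply use IntegerResample in PixInsight):

```python
def downsample(img, factor):
    """Block-average downsampling. Because the background illumination
    profile is an extremely smooth function, averaging factor x factor
    pixel blocks loses essentially nothing relevant to a 3D plot."""
    h, w = len(img), len(img[0])
    return [[sum(img[y + dy][x + dx]
                 for dy in range(factor)
                 for dx in range(factor)) / factor**2
             for x in range(0, w - factor + 1, factor)]
            for y in range(0, h - factor + 1, factor)]

# A tiny made-up "image" with a smooth brightness ramp:
img = [[1, 1, 3, 3],
       [1, 1, 3, 3],
       [5, 5, 7, 7],
       [5, 5, 7, 7]]
print(downsample(img, 2))  # → [[1.0, 3.0], [5.0, 7.0]]
```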
Hope this answers your question. If not, we can rack our brains to find more and better ways
![smile :)](http://pixinsight.com/forum/Smileys/default/smile.gif)