one possible problem here is that DBE has a single tolerance value which is applied to all 3 channels of an RGB image. so if one or two channels have a much higher background level than the other(s), you may end up raising the tolerance so high to capture the weaker channel that the stronger channels are oversampled. the way around this is to pre-normalize the 3 channels, for instance by splitting out the 3 channels, linear fitting 2 of them to the third, reassembling the RGB image, and then running DBE. alternatively, you could first run a BackgroundNeutralization on the RGB image and then run DBE.
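just to illustrate the idea behind the linear fit step - this is a rough numpy sketch, not PixInsight's actual LinearFit process (which also does outlier rejection); the data and function name are made up for the example. it fits a scale and offset that map one channel onto the reference channel, so the backgrounds line up before DBE sees them:

```python
import numpy as np

def linear_fit_channel(source, reference):
    """Least-squares fit: find a, b so that a*source + b best matches
    reference, then return the rescaled source channel.
    (A simplified stand-in for a LinearFit-style normalization.)"""
    a, b = np.polyfit(source.ravel(), reference.ravel(), 1)
    return a * source + b, (a, b)

# hypothetical example: green background is brighter than red's
rng = np.random.default_rng(0)
r = rng.normal(0.10, 0.01, (64, 64))               # reference channel
g = 0.5 * r + 0.20 + rng.normal(0, 1e-4, r.shape)  # offset/scaled background
g_fit, (a, b) = linear_fit_channel(g, r)
# after fitting, g's background level should match r's
```

with all 3 channels normalized this way, a single DBE tolerance has a chance of working for the whole image.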
that might not be the problem, though - as adam points out, these gradients can have other sources, some of them sensor-related. to his point about desaturating the background: one trick i've seen posted here is to make a very strong mask (with RangeMask) that protects the stars and the foreground object, and then use the AtrousWavelets tool in Chrominance mode to remove the large-scale structures. that will pretty much eliminate the large-scale color blotches in the image, but you have to be very careful - it's easy to desaturate the stars or your target.
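for intuition, here's a toy version of that trick in numpy/scipy - emphatically not what AtrousWavelets does internally (it works on wavelet layers; this sketch approximates "large scale" with a Gaussian blur, and uses a plain channel mean as luminance). the function name, sigma, and demo data are all invented for the example:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def flatten_chrominance(rgb, mask, sigma=8.0):
    """Suppress large-scale color structure while keeping small-scale
    detail: split each channel into luminance + color residual, estimate
    the residual's large-scale part with a blur, and subtract only that."""
    lum = rgb.mean(axis=2)  # crude luminance estimate
    out = np.empty_like(rgb)
    for c in range(3):
        chrom = rgb[:, :, c] - lum             # color-difference residual
        large = gaussian_filter(chrom, sigma)  # its large-scale component
        corrected = rgb[:, :, c] - large       # subtract the blotches only
        # mask=1 protects (keeps the original), mask=0 applies the fix
        out[:, :, c] = mask * rgb[:, :, c] + (1.0 - mask) * corrected
    return out

# hypothetical demo: flat gray frame with a smooth red cast on one side
h = w = 100
rgb = np.full((h, w, 3), 0.5)
rgb[:, :, 0] += np.linspace(0.0, 0.1, w)[None, :]  # large-scale red blotch
mask = np.zeros((h, w))                            # nothing protected here
out = flatten_chrominance(rgb, mask)
```

the mask is doing the same job as the RangeMask in the real workflow: without it, anything small and colorful (stars, your target) gets washed out along with the blotches.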
rob