Here is an option, which may reduce outlier sensitivity and be more consistent:
SNRWeight * 2^(-k * FWHMSigma), where k is a nonnegative parameter.
With k set to 0, the weighting equals SNRWeight, which is a (scale / noise)^2 weighting metric that Juan has argued results in a good maximum likelihood estimator for integration purposes.
FWHMSigma equals 0 on frames with median FWHM, +1 on frames with FWHM one sigma above median FWHM, and -1 on frames with FWHM one sigma below median FWHM.
Frames with negative FWHMSigma should be weighted higher than frames with positive FWHMSigma.
Setting k to a positive value will do this. For example, with k set to 1, frames with 0 FWHMSigma are weighted SNRWeight, frames with +1 FWHMSigma are weighted 0.5 * SNRWeight (i.e. less weight), and frames with -1 FWHMSigma are weighted 2 * SNRWeight (more weight).
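To make the idea concrete, here is a rough Python sketch of the computation (not PixInsight's own implementation). The function names are mine, and the post does not say exactly how the sigma for FWHMSigma is estimated, so the plain standard deviation below is an assumption; a robust estimator would work just as well.

```python
import numpy as np

def fwhm_sigma(fwhm):
    """Normalize FWHM per frame: 0 at the median FWHM, +1 one sigma above
    the median, -1 one sigma below. Using np.std as the sigma estimate is
    an assumption; the post only says 'one sigma'."""
    fwhm = np.asarray(fwhm, dtype=float)
    return (fwhm - np.median(fwhm)) / np.std(fwhm)

def weight(snr_weight, fwhm, k=0.25):
    """SNRWeight * 2^(-k * FWHMSigma). With k = 0 this reduces to SNRWeight;
    larger k penalizes above-median FWHM (and rewards below-median FWHM) more."""
    return np.asarray(snr_weight, dtype=float) * 2.0 ** (-k * fwhm_sigma(fwhm))
```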
A k value of 1 may be too large, though; at least on my frames, k values of around 0.25 to 0.5 seem more reasonable. Some examples are shown below. As before, run several integrations with different k values and choose the one you like best.
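For instance, using the sketch above, you could compare the candidate weights for a few k values before committing to the integrations (the SNRWeight and FWHM numbers here are purely made up for illustration):

```python
snr = [12.5, 18.0, 15.2, 14.1]   # hypothetical SNRWeight values
fwhm = [3.1, 2.4, 2.7, 2.6]      # hypothetical FWHM values for the same frames

for k in (0.0, 0.25, 0.5, 1.0):
    print(f"k={k}: {np.round(weight(snr, fwhm, k), 2)}")
```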
Basically, I like SNRWeight by itself because it has an objective basis. I just want to tweak it a little bit based on FWHM.
PS: If you wish, you can include EccentricitySigma in the same way, as an additional multiplicative term in the weighting with its own parameter.
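A sketch of that extension, building on the functions above; the name weight_with_eccentricity, the second parameter k_ecc, and the use of np.std are again my assumptions:

```python
def weight_with_eccentricity(snr_weight, fwhm, eccentricity,
                             k_fwhm=0.25, k_ecc=0.25):
    """Multiply in a second 2^(-k * sigma) term for eccentricity,
    with its own parameter, analogous to the FWHM term."""
    ecc = np.asarray(eccentricity, dtype=float)
    ecc_sigma = (ecc - np.median(ecc)) / np.std(ecc)
    return weight(snr_weight, fwhm, k_fwhm) * 2.0 ** (-k_ecc * ecc_sigma)
```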
Thanks,
Mike
Here are SNRWeight and FWHM for a set of frames:
Here is the resulting weighting with k set to 0.25. Frame 1 has a relatively poor FWHM, so its weighting is penalized. Frames 22 and 23 have relatively good FWHM, so their weightings are rewarded:
Here is the resulting weighting with k set to 0.5. Same as before, except the FWHM variations have a larger effect: