I'm curious... I have run three old image sets through BatchPreprocessing without MRS reverting to k-sigma during integration. The same image sets produced these errors frequently when I preprocessed them manually (dslr_raw workflow).
There were some issues I couldn't resolve with one image set, though.

My imaging system, such as it is, scheduled half a session of 30-second exposures instead of 180 seconds. Consequently the light data is useless, but the scaling factors are quite evident against these 30-second frames when using the preprocessing script. Not so with the dslr_raw workflow, where MRS reverts to k-sigma at the slightest hint of stress.
So what is the difference? A newer computer with more memory and a faster processor, maybe? Does MRS revert to k-sigma because of memory issues?
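Just to make concrete what I mean by "reverting to k-sigma", here is a toy sketch in Python. It is purely illustrative and not PixInsight's actual implementation; the thresholds, the 1% noise fraction, and all function names are my own assumptions. It shows a noise estimator that tries a multiscale estimate first and falls back to iterative k-sigma clipping when the multiscale estimate fails to converge:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def k_sigma_noise(data, k=3.0, iterations=10):
    """Iterative k-sigma clipping estimate of the noise standard deviation."""
    d = np.asarray(data, dtype=np.float64).ravel()
    for _ in range(iterations):
        median = np.median(d)
        sigma = d.std()
        clipped = d[np.abs(d - median) < k * sigma]
        if clipped.size == 0 or clipped.size == d.size:
            break
        d = clipped
    return d.std()

def multiscale_noise(data, min_noise_fraction=0.01):
    """
    Toy stand-in for an MRS-style estimate: look only at the finest-scale
    residual (image minus a 3x3 smoothed version) and keep small residuals.
    Returns None when too few pixels qualify as noise, mimicking the
    'no convergence' case.
    """
    data = np.asarray(data, dtype=np.float64)
    residual = data - uniform_filter(data, size=3)
    sigma0 = residual.std()
    noise_pixels = residual[np.abs(residual) < 3.0 * sigma0]
    if noise_pixels.size < min_noise_fraction * data.size:
        return None
    return noise_pixels.std()

def estimate_noise(data):
    """Try the multiscale estimate first; fall back to k-sigma if it fails."""
    sigma = multiscale_noise(data)
    if sigma is None:
        return k_sigma_noise(data), "k-sigma (fallback)"
    return sigma, "multiscale"

# Example: a synthetic frame with pure Gaussian noise
frame = 0.1 + 0.01 * np.random.default_rng(0).standard_normal((256, 256))
print(estimate_noise(frame))
```

In this toy version the fallback is driven entirely by the data (how many pixels can be isolated as noise), which is why I'm asking what actually triggers it in the real routine.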
Anyway, it would be nice to know whether optimizations have been employed in the script.