For the most part PI 1.7 is very stable (I've run it for days without restarting at times), but lately, while using the BatchPreprocessing script, it randomly crashes while determining the noise estimates before the final integration. I'm integrating several hundred Canon XTi CR2 images, so maybe it's just the number of subs that isn't being handled gracefully; smaller sets never seem to be an issue, however.
I'm curious: how much noise data is actually stored in the cache for a single image? Given the speed of modern disks, it would have to be megabytes of data per image for writing it to permanent storage after each evaluation to be a real performance concern. Perhaps it's writing out the entire cache each time that is the concern...
I'm not a file format expert, but does the FITS format allow storing data in the header of the file? Would it be beneficial to store the noise evaluation in the file itself? Then, if the cache doesn't contain the necessary data, the file header could be checked, with full image evaluation as a final fallback if the header doesn't contain the data either. Reading file headers should be fairly quick - much faster than reading the entire file and re-evaluating the image.
I don't feel comfortable using PI 1.8 at this time due to the issues still being resolved and the number of scripts/modules that are still being updated by their authors. I'll wait until at least a month after GA before really deciding if I should test those waters. Hope you understand.
In any case, thanks for taking a look.
Craig