Crash to Desktop - Large file size Image Integration

monty

Member
I recently got my first cooled OSC camera, a ZWO ASI2600mc. I have been working on gathering and processing data on M45. Using the default settings in WBPP, I had a couple of crashes during image integration when I was processing around 150 subs, but it finally completed on the 3rd try. Today I am processing around 270 subs, and during the image integration process PixInsight just closes while it is working on channel 1/3. I don't get any errors; it just closes. It happened three consecutive times.

I unchecked Automatic buffer size and set the buffer size to 64 MB and the stack size to 1024 MB. This time it did not crash on me.
I am running Kubuntu 20.04 (a dedicated OS for PixInsight, so no other programs are running in the background).
CPU: Ryzen 3900
Memory: 64 GB
Storage: 1 TB NVMe
Latest version of PixInsight
ZWO ASI2600mc camera file size: 50 MB
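For a sense of scale, here is a rough sketch of where those file sizes come from, assuming the published ASI2600MC sensor dimensions and that WBPP writes calibrated frames as debayered 32-bit float RGB (my assumptions, not figures from this thread):

```python
# Back-of-the-envelope sub sizes. Assumptions: 6248 x 4176 sensor, 16-bit raw
# frames, calibrated frames stored as debayered 32-bit float RGB.
SENSOR_PIXELS = 6248 * 4176                              # ~26.1 MP

raw_mb = SENSOR_PIXELS * 2 / 1024**2                     # 16-bit CFA raw
cal_mb = SENSOR_PIXELS * 3 * 4 / 1024**2                 # 3 channels x 32-bit float

print(f"raw sub:        ~{raw_mb:.0f} MB")               # ~50 MB, matches the camera files
print(f"calibrated sub: ~{cal_mb:.0f} MB")               # ~300 MB after debayering to float
print(f"270 calibrated subs: ~{270 * cal_mb / 1024:.0f} GB")  # ~79 GB of pixel data
```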
 
I have been running additional tests with different buffer and stack sizes. Here is what happens on my system:
Auto -> PI crashes. (I am assuming Linux is killing PI as it runs out of memory.)
Buffer 16 MB, stack 1024 MB: worked (but it takes a long time)
Buffer 32 MB, stack 1024 MB: worked
Buffer 64 MB, stack 2048 MB: worked
Currently testing buffer 300 MB, stack 2048 MB.

I did not check the run time for each of the configurations, but I may do that when I get some time.
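As a rough sketch of why the larger settings leave little headroom, assuming the configured buffer size is held once per open input file (my reading of the parameter, not confirmed PixInsight internals):

```python
# Estimated total row-buffer memory for ImageIntegration, assuming one buffer
# of the configured size per input file (a guess at the parameter's meaning,
# not official PixInsight documentation).
def total_buffer_gb(buffer_mb: int, n_files: int) -> float:
    return buffer_mb * n_files / 1024

for buffer_mb in (16, 32, 64, 300):
    print(f"buffer {buffer_mb:>3} MB x 270 subs -> ~{total_buffer_gb(buffer_mb, 270):.1f} GB")
# 16 MB  -> ~4.2 GB
# 32 MB  -> ~8.4 GB
# 64 MB  -> ~16.9 GB
# 300 MB -> ~79.1 GB, already above the 64 GB of installed RAM
```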
 
Quick update: I had a few more crashes today running local normalization. It happened about three times during the 1st part of LN (loading the files). Same behavior as before: memory would max out at 64 GB and then PI would crash to desktop. This time I was processing 259 subs, and each sub is 296 MB.
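The arithmetic alone lines up with that, assuming LN (plus the file cache and other working data) ends up holding most of the calibrated frames in memory at once:

```python
# 259 calibrated subs at ~296 MB each, versus 64 GB of RAM.
subs, sub_mb, ram_gb = 259, 296, 64
total_gb = subs * sub_mb / 1024
print(f"dataset: ~{total_gb:.0f} GB vs {ram_gb} GB RAM")  # ~75 GB vs 64 GB
```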
 
This is not a bug at all, just the expected behavior on Linux.

This is a "kill process or sacrifice child" Linux kernel event caused by a complete memory exhaustion. Although the automatic buffers size mode of ImageIntegration tries to prevent these issues as much as possible, it has certain limits, especially when some highly demanding working modes are selected (such as large-scale pixel rejection for example). LocalNormalization is another demanding process.

You definitely need more RAM to perform these tasks on the huge data sets that you are processing. A minimum of 96 GiB of RAM would be necessary, and 128 GiB is strongly recommended. Trying to run these tasks with just 64 GiB is not realistic.
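One way to confirm that it is the kernel's OOM killer terminating PixInsight is to check the kernel log right after a crash. A minimal sketch, assuming journalctl is available and readable by your user (the exact message wording varies between kernel versions, so this just matches the usual keywords):

```python
# Scan the kernel log for out-of-memory events after a crash to desktop.
import subprocess

kernel_log = subprocess.run(
    ["journalctl", "-k", "--no-pager"],
    capture_output=True, text=True, check=False,
).stdout

for line in kernel_log.splitlines():
    if "out of memory" in line.lower() or "oom" in line.lower():
        print(line)
```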
 
Thanks for the update, Juan. I am looking into upgrading my memory to 128 GB. (Good thing I planned for it by only using two 32 GB RAM sticks.) Until then, will the same problem happen on the Windows version?
 
Everything would be worse on Windows regarding memory management, especially under high-pressure situations.
 
Well time to find a way to plant my Christmas gift idea in my wife's head. Or am I asking too much now :p
 