Hi Jeff,
I'm having an issue with the Batch Preprocessing Script in the new version 1.8.7: the screen goes black after a few seconds and never recovers.
We cannot reproduce this problem. BPP runs normally on all of our working and testing machines, on all platforms.
So my question is: is there a way to force PI to use the GeForce card and hopefully avoid the rendering issue?
On Windows, the PixInsight core application already attempts to use the discrete GPU exclusively on machines with dual graphics. This should work for both NVIDIA and AMD graphics cards. However, whether or not this works on a particular machine is a different matter; one never knows for sure on Windows, especially on laptops, where the concept of standards is pretty fuzzy. There is nothing more I can do to fix these issues.
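For reference, the standard mechanism for a Windows executable to request the discrete GPU on dual-graphics laptops is to export two well-known global symbols that the NVIDIA and AMD drivers look for. The sketch below illustrates that general technique; it is not necessarily the exact code used by the core application.

// Minimal sketch of the usual way a Windows application opts into the
// high-performance GPU on NVIDIA Optimus / AMD PowerXpress laptops.
// Illustration only; not necessarily what the PixInsight core does.
#include <windows.h>

extern "C"
{
   // Hint to the NVIDIA driver: run this process on the discrete GPU.
   __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;

   // Equivalent hint for AMD switchable graphics.
   __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main()
{
   // These exports only take effect if the installed drivers honor them.
   // OEM driver customizations on some laptops ignore the hints, which is
   // why the same application can behave differently from machine to machine.
   return 0;
}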
What is really strange is that it works with 1.8.6 and not with 1.8.7. There are no changes between the two versions that could explain such different, and problematic, behavior.
I think this may be a memory issue. I reinstalled 1.8.7 and re-ran the same file set: 100 bias frames, 100 darks, 60 flats, and 60 lights.
The process failed when it began to integrate the bias frames, but it did not crash the computer, so I was able to get the log file. The last portion is this:
[2019-10-07 13:59:59] Reading FITS image: 16-bit integers, 1 channel(s), 3336x2496 pixels: done
[2019-10-07 13:59:59] Computing image statistics: done
[2019-10-07 13:59:59] Weight : 1.00000
[2019-10-07 13:59:59]
[2019-10-07 13:59:59]
[2019-10-07 13:59:59] Integration of 100 images:
[2019-10-07 13:59:59] Pixel combination .................. average
[2019-10-07 13:59:59] Output normalization ............... none
[2019-10-07 13:59:59] Weighting mode ..................... don't care
[2019-10-07 13:59:59] Scale estimator .................... MAD
[2019-10-07 13:59:59] Pixel rejection .................... Winsorized sigma clipping
[2019-10-07 13:59:59] Rejection normalization ............ none
[2019-10-07 13:59:59] Rejection clippings ................ low=yes high=yes
[2019-10-07 13:59:59] Rejection parameters ............... sigma_low=4.000 sigma_high=3.000 cutoff=5.000
[2019-10-07 13:59:59]
[2019-10-07 13:59:59] * Available physical memory: 13.126 GiB
[2019-10-07 13:59:59] * Using 2496 concurrent pixel stack(s) = 12.438 GiB
[2019-10-07 13:59:59]
[2019-10-07 13:59:59] Integrating pixel rows: 0 -> 2495: 80%
[2019-10-07 14:00:36] *** PCL Win32 System Exception: At address 00007FFE1E3EA839 with exception code C0000005 :
[2019-10-07 14:00:36] Access violation: invalid memory read operation at address 0000000000000000
[2019-10-07 14:00:36]
[2019-10-07 14:00:36] ************************************************************
[2019-10-07 14:00:36] * End integration of bias frames
[2019-10-07 14:00:36] ************************************************************
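Doing some rough math on those numbers, the integration buffers alone nearly exhaust the available memory. The ~16 bytes per pixel per image figure below is inferred from the figures in the log itself, not taken from any documentation:

// Back-of-the-envelope check of the stack memory reported in the log.
// The bytes-per-pixel-per-image value is an assumption inferred from the
// reported 12.438 GiB figure (pixel data plus rejection bookkeeping).
#include <cstdio>

int main()
{
   const double numImages          = 100;    // bias frames being integrated
   const double width              = 3336;   // pixels per row
   const double height             = 2496;   // rows = concurrent pixel stacks
   const double bytesPerPixelImage = 16;     // assumed per-pixel, per-image cost
   const double GiB                = 1024.0*1024*1024;

   double stackBytes = numImages*width*height*bytesPerPixelImage;
   std::printf( "Estimated stack memory: %.3f GiB\n", stackBytes/GiB );
   // Prints about 12.41 GiB, close to the "12.438 GiB" in the log and leaving
   // well under 1 GiB of the reported 13.126 GiB of free physical memory.
   return 0;
}

So it looks like the integration was running right at the limit of physical memory when the access violation occurred.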
I read about disabling the automatic buffer size, so I did that in the ImageIntegration process (it still failed, but gracefully). Is that setting also used by BPP? Is there somewhere else I can check?
Thanks,
Jeff