Segmentation Fault from sep.extract() with large arrays #122
Comments
Is your array datatype float32 or float64? Assuming it is float32, a 24000x24000 array is 2.304e9 bytes (24000^2 * 4), and a 23000x23000 array is 2.116e9 bytes. The maximum value a 32-bit int can hold is 2.147e9, which is right in between. That leads me to believe the problem is that we're using 32-bit ints to address memory somewhere.
Does the segfault also occur with an array of all zeros? If it doesn't, that would tell me the problem is likely a variable that has to do with the number of pixels belonging to an object.
I think you are right on target. My array is indeed a float32 array. I upscaled it to a float64 array at 17000x17000 (dimensions that worked at float32), and it failed with the same seg fault error. 17000^2 * 8 = 2.312e9, so I expect that the int32 memory address is indeed the problem! I'll test the all zeros now....
Same results with all-zero arrays at both float32 and float64: below the 2.147e9-byte threshold everything works fine; above it I get the segfault. Thanks for the help!
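The failing and working cases reported above are consistent with a threshold on the total byte count at 2^31 - 1, independent of dtype. A small sanity check (numpy only, no sep required):

```python
import numpy as np

INT32_MAX = 2**31 - 1  # 2,147,483,647

def exceeds_int32_bytes(shape, dtype):
    """True if an array of this shape/dtype is too large to be
    addressed with a signed 32-bit byte offset."""
    nbytes = int(np.prod(shape, dtype=np.int64)) * np.dtype(dtype).itemsize
    return nbytes > INT32_MAX

# failing cases from this thread
assert exceeds_int32_bytes((24000, 24000), np.float32)      # 2.304e9 bytes
assert exceeds_int32_bytes((17000, 17000), np.float64)      # 2.312e9 bytes
# working cases
assert not exceeds_int32_bytes((23000, 23000), np.float32)  # 2.116e9 bytes
assert not exceeds_int32_bytes((17000, 17000), np.float32)  # 1.156e9 bytes
```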
We have recently encountered the same problem with new images from JWST/NIRCam. Passing a 24000x24000-pixel float32 array to sep.extract gives a segmentation fault, so it seems this has not been fixed. Since we are interested in using sep on possibly even larger images in the future, we would like to know whether there is any intention to fix the problem and use 64-bit ints to address memory in a future version?
Just re-iterating here that it would be great to find the solution to this issue. I have encountered the same problem attempting source extraction on images from JWST NIRCam. For now I am splitting the images into multiple subsets before extracting, but that adds a lot of additional headache.
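As a stopgap, the tile-and-merge workaround described above can be kept fairly painless. A minimal sketch (numpy only; `extract_fn` is a hypothetical wrapper around the real detector, e.g. `lambda t: sep.extract(t, 5, err=rms)`, and the overlap should exceed the largest expected object so nothing is split across tile cores):

```python
import numpy as np

def extract_tiled(image, tile_size, overlap, extract_fn):
    """Run a source-extraction callable on overlapping tiles and shift
    the returned x/y positions back to full-image coordinates.

    extract_fn(tile) must return a structured array with 'x' and 'y'
    fields, as sep.extract does. Detections are kept only when their
    centers fall in a tile's non-overlapping core, so an object seen
    by two tiles is counted once.
    """
    results = []
    ny, nx = image.shape
    step = tile_size - overlap
    for y0 in range(0, ny, step):
        for x0 in range(0, nx, step):
            tile = image[y0:y0 + tile_size, x0:x0 + tile_size]
            objs = extract_fn(np.ascontiguousarray(tile))
            if len(objs) == 0:
                continue
            objs = objs.copy()
            objs['x'] += x0  # back to full-image coordinates
            objs['y'] += y0
            # core filter; tiles touching the image edge keep everything
            keep = ((objs['x'] < x0 + step) | (x0 + tile_size >= nx)) & \
                   ((objs['y'] < y0 + step) | (y0 + tile_size >= ny))
            results.append(objs[keep])
    return np.concatenate(results) if results else np.empty(0)
```

This deduplicates by position only; objects larger than the overlap can still be truncated at tile edges, which is part of the "additional headache" a real fix in sep would remove.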
This should now be fixed with the latest release. |
I'm running into a Segmentation fault error ("Fatal Python error: Segmentation fault", "Segmentation fault (core dumped)") when calling sep.extract() on large images/arrays (36000x36000 elements).
Specifically, I am calling `sep.extract(image, 5, err=bkg.globalrms)`, where `image` is a 36000x36000 NumPy array and `bkg` is the output of `sep.Background(image)`.
As part of my debugging, I've restricted the input array size and found that the issue first appears at a size of around 24000x24000 (sep.extract() works fine at 23000x23000).
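That bisection can be automated. Since a segfault kills the calling interpreter, the real probe has to run the `sep.extract` call in a subprocess and report its exit status; the search itself is a generic monotonic bisection (sketch; `works` is a hypothetical probe callable):

```python
def largest_passing(lo, hi, works):
    """Binary-search the largest n in [lo, hi] with works(n) True.

    Assumes works(n) is monotonic: True up to some threshold size,
    False beyond it. For the real probe, run sep.extract on an
    n x n zero array inside a subprocess and check its exit code,
    so a segfault in the child does not kill this search loop.
    """
    if not works(lo):
        raise ValueError("even the smallest size fails")
    while lo < hi:
        mid = (lo + hi + 1) // 2  # round up so the loop always narrows
        if works(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo
```

With a probe that fails once the float32 image exceeds 2^31 - 1 bytes, this converges in ~15 probes to n = 23170 (23170^2 * 4 = 2.1474e9 bytes), matching the 23000-works / 24000-fails bracket above.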
Is there a maximum array size that sep.extract() is designed to work with? sep.Background() runs just fine on the 36000x36000 arrays, so I am surprised that sep.extract() fails.
While it is straightforward to divide my images into smaller pieces and run sep.extract() separately on each piece, I am raising this issue because it is more convenient to do the extractions on larger images (within memory restrictions of course).
I am using sep 1.2.0 installed via pip (I also saw this problem with sep 1.0.3) with python 3.7.10. I have plenty of free system memory when calling sep.extract().
Thanks!