test_mmap uses cruel and unusual amounts of disk space #68141
Comments
My laptop is running 64-bit Linux (14.10). It has 4.6GB of free disk space. Naturally that's not enough to run test_mmap. When I run the test suite, test_mmap consumes all available disk space, then fails (hopefully freeing all its temporary files!). If I use "-j" to run more than one test at a time, this usually means *other* tests fail too, because I'm running multiple tests in parallel and there are plenty of other tests that require, y'know... any disk space whatsoever.

The documentation for the test suite ("./python -m test -h") says that "-u largefile" allows tests that use more than 2GB. Surely test_mmap's delicious 800PB tests should be marked largefile-enabled-only?

I'd like to see this fix backported to 3.4 too. And if 2.7 shows this behavior, maybe there too.
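For context, a minimal sketch of what "marking a test largefile-enabled-only" looks like, assuming the standard `test.support` helpers (the class name here mirrors the existing `LargeMmapTests`; the message string is illustrative):

```python
import unittest
from test import support


class LargeMmapTests(unittest.TestCase):
    def setUp(self):
        # Raises support.ResourceDenied (a unittest.SkipTest subclass)
        # unless the suite was started with "-u largefile" or "-u all",
        # so the disk-hungry tests are skipped by default.
        support.requires('largefile',
                         'test requires several GiB of free disk space')
```

With this guard in place, a plain `./python -m test test_mmap` skips the large tests instead of filling the disk.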
What is your filesystem?
By the way, I'm pretty sure test_mmap doesn't require 4GB of disk space here (Ubuntu 14.10), since it runs in only 0.071s; it also runs fine from a partition with only 3GB of free space.
ZFS. |
Perhaps ZFS doesn't support sparse files? Or perhaps there's another way to convince it to create a sparse file? How long does test_mmap take to run on your machine? |
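The sparse-file question above can be checked directly. A small sketch (assuming a Unix-like system, where `os.stat` exposes `st_blocks`): seek past the end of an empty file and write one byte, producing a nominally 1 GiB file, then compare apparent size to allocated size.

```python
import os
import tempfile

# Create a nominally 1 GiB file: seek past the end and write one byte.
# On filesystems with sparse-file support only the final block is
# allocated; without it, the whole gap is materialized on disk.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.seek(1024 * 1024 * 1024 - 1)
    f.write(b'\0')
    path = f.name

st = os.stat(path)
print('apparent size :', st.st_size)           # 1 GiB
print('allocated size:', st.st_blocks * 512)   # tiny if the file is sparse
os.remove(path)
```

If the "allocated size" comes out near 1 GiB, the filesystem (or its configuration) is materializing the holes, which would explain both the disk usage and the long run time.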
Wall time was 3 minutes 38 seconds.

```
% time ./python -m test test_mmap
```
Is there anything we can do about this? Should we skip this test on some file systems? |
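One way to skip on such filesystems would be a runtime probe rather than a filesystem-name check. A hedged sketch (`supports_sparse_files` is a hypothetical helper, not part of `test.support`; the 16 MiB probe size is arbitrary):

```python
import os
import tempfile


def supports_sparse_files(dirname='.'):
    # Hypothetical probe: punch a 16 MiB hole with os.truncate() in a
    # temporary file under *dirname*, then check whether the filesystem
    # actually allocated blocks for the hole.
    fd, path = tempfile.mkstemp(dir=dirname)
    try:
        os.truncate(path, 16 * 1024 * 1024)
        allocated = os.stat(path).st_blocks * 512
        return allocated < 16 * 1024 * 1024
    finally:
        os.close(fd)
        os.remove(path)
```

A test could then call `unittest.skipUnless(supports_sparse_files(), ...)` on the large mmap cases, so the decision tracks the filesystem's actual behavior instead of a hard-coded list.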
Probably, we can run `test_mmap.LargeMmapTests.*` with a timeout.

Linked PRs: #101774