Slaunger
OS: Win XP SP3, 32 bit
Python 2.5.4
Hi, I have run into some problems allocating numpy.memmaps
exceeding an accumulated size of about 2 GB. I have found that
the real problem relates to numpy.memmap using mmap.mmap.
I've written a small test program to illustrate it:
import itertools
import mmap
import os

files = []
mmaps = []
file_names = []
mmap_cap = 0
bytes_per_mmap = 100 * 1024 ** 2
try:
    for i in itertools.count(1):
        file_name = "d:/%d.tst" % i
        file_names.append(file_name)
        f = open(file_name, "w+b")
        files.append(f)
        mm = mmap.mmap(f.fileno(), bytes_per_mmap)
        mmaps.append(mm)
        mmap_cap += bytes_per_mmap
        print "Created %d writeable mmaps containing %d MB" % \
            (i, mmap_cap / (1024 ** 2))
finally:
    # Clean up
    print "Removing mmaps..."
    for mm, f, file_name in zip(mmaps, files, file_names):
        mm.close()
        f.close()
        os.remove(file_name)
    print "Done..."
which creates this output:
Created 1 writeable mmaps containing 100 MB
Created 2 writeable mmaps containing 200 MB
.....
Created 17 writeable mmaps containing 1700 MB
Created 18 writeable mmaps containing 1800 MB
Removing mmaps...
Done...
Traceback (most recent call last):
  File "C:\svn-sandbox\research\scipy\scipy\src\com\terma\kha\mmaptest.py", line 16, in <module>
    mm = mmap.mmap(f.fileno(), bytes_per_mmap)
WindowsError: [Error 8] Not enough storage is available to process this command
There is more than 25 GB of free space on drive d: at this stage.
Is it a bug, or a "feature" of the 32-bit OS?
I am surprised by it, as I have not found any notes about this
kind of limitation in the documentation.
I am in dire need of these large memmaps for my task, and it is not an
option to change the OS due to other constraints in the system.
Is there anything I can do about it?
Best wishes,
Kim