How do you reduce the memory requirements of a large hash?

timnels

I've tried using DBM::Deep to store a large hash on disk since I am
getting "Out of memory" errors. It works, but runs too slow, so I was
considering using NDBM_File to attempt the same thing, something like:

use NDBM_File;
use Fcntl;    # for the O_RDWR and O_CREAT flags

my $inputs = {};
tie(%{$inputs}, 'NDBM_File', 'foo', O_RDWR|O_CREAT, 0666)
    or die "Couldn't tie NDBM file 'foo': $!; aborting";

and then watching "ps" as it adds 200,000 records, the process seems
to be chewing up memory as I add those keys, whereas DBM::Deep doesn't.
Am I missing something obvious here?

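For concreteness, the load loop in question is essentially this shape
(the key and value names here are placeholders, not the real data):

# Insert 200,000 records into the tied hash while watching the
# process size from another terminal with "ps".
for my $i (1 .. 200_000) {
    $inputs->{"key$i"} = "value$i";
}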

Thanks.
 
xhoster

> I've tried using DBM::Deep to store a large hash on disk since I am
> getting "Out of memory" errors. It works, but runs too slow, so I was
> considering using NDBM_File to attempt the same thing, something like:

If you store the data to disk rather than in memory, it is going to be
slow. If it is stored in memory as well as on disk, then it will not cure
your "out of memory" errors. There is no general solution to this problem
other than buying more memory. There may be specific solutions, but they
would depend on the specifics of what you are doing with this large hash.

> my $inputs = {};
> tie(%{$inputs}, 'NDBM_File', 'foo', O_RDWR|O_CREAT, 0666)
>     or die "Couldn't tie NDBM file 'foo': $!; aborting";

> and then watching "ps" as it adds 200,000 records, the process seems
> to be chewing up memory as I add those keys, whereas DBM::Deep
> doesn't. Am I missing something obvious here?

The docs for NDBM_File suggest that its purpose is data persistence, not
memory efficiency.

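To pin down where the memory actually goes, one option is to report the
resident set size from inside the loop instead of eyeballing "ps". A
rough sketch, assuming Linux (/proc is not portable) and the tied
$inputs hash from the question:

# Print the VmRSS line from /proc/self/status (Linux only).
sub report_rss {
    my ($count) = @_;
    open my $fh, '<', '/proc/self/status' or return;
    while (my $line = <$fh>) {
        print "after $count inserts: $line" if $line =~ /^VmRSS/;
    }
    close $fh;
}

for my $i (1 .. 200_000) {
    $inputs->{"key$i"} = "value$i";
    report_rss($i) if $i % 50_000 == 0;
}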

Xho
 
