Catherine Moroney
I'm writing a Python program that reads in a very large
"pickled" file (consisting of one large dictionary and one
small one) and parses the results out to several binary and
HDF files.
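
The read side looks roughly like the sketch below; the file name
and variable names are placeholders, and the two dictionaries may
really be stored as a single pickled tuple rather than two
successive loads:

    import cPickle

    # Placeholder name for the ~900 MB pickle file; the real program
    # goes on to write the parsed results to binary and HDF files.
    PICKLE_FILE = 'results.pkl'

    f = open(PICKLE_FILE, 'rb')
    try:
        big_dict = cPickle.load(f)    # the large dictionary
        small_dict = cPickle.load(f)  # the small one
    finally:
        f.close()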
The program works fine, but the memory load is huge. The pickle
file is about 900 MB on disk, so I would theoretically expect my
program to consume about twice that (the dictionary contained in
the pickle file plus its repackaging into other formats), but
instead my program needs almost 5 GB of memory to run.
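
As a rough sanity check on those numbers, something like the
following (Linux-only, since it reads /proc/self/status, and using
a made-up dictionary in place of the real data) compares a
structure's pickled size with how much the process's resident set
grows while building it; per-object overhead typically makes the
in-memory figure several times larger than the pickle:

    import cPickle, os

    def rss_kb():
        """Return this process's resident set size in kB (Linux only)."""
        for line in open('/proc/%d/status' % os.getpid()):
            if line.startswith('VmRSS:'):
                return int(line.split()[1])
        return 0

    before = rss_kb()
    # A throwaway dictionary standing in for the real data.
    d = dict((i, float(i)) for i in xrange(1000000))
    after = rss_kb()

    print 'pickled size:     %d kB' % (len(cPickle.dumps(d, 2)) / 1024)
    print 'in-memory growth: %d kB' % (after - before)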
Am I being unrealistic in my memory expectations?
I'm running Python 2.5 on a Linux box (Fedora release 7).
Is there a way to see how much memory is being consumed
by a single data structure or variable? How can I go about
debugging this problem?
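The only crude thing I can think of is printing the memory usage
reported by /proc at a few checkpoints and seeing where it jumps,
along the lines of the sketch below (Linux-only; the stage labels
and commented-out calls are placeholders for the real steps). Is
there anything that can be pointed at an individual variable
instead?

    import os

    def report(label):
        """Print this process's current VmRSS and VmPeak (Linux only)."""
        status = {}
        for line in open('/proc/%d/status' % os.getpid()):
            if ':' in line:
                key, value = line.split(':', 1)
                status[key] = value.strip()
        print '%-22s VmRSS=%s  VmPeak=%s' % (
            label, status.get('VmRSS'), status.get('VmPeak'))

    # Sprinkle calls like these through the program:
    report('start')
    # big_dict = cPickle.load(f)
    report('after unpickling')
    # ... write the binary / HDF files ...
    report('after writing output')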
Catherine
"pickled" file (consisting of one large dictionary and one
small one), and parses the results out to several binary and hdf
files.
The program works fine, but the memory load is huge. The size of
the pickle file on disk is about 900 Meg so I would theoretically
expect my program to consume about twice that (the dictionary
contained in the pickle file plus its repackaging into other formats),
but instead my program needs almost 5 Gig of memory to run.
Am I being unrealistic in my memory expectations?
I'm running Python 2.5 on a Linux box (Fedora release 7).
Is there a way to see how much memory is being consumed
by a single data structure or variable? How can I go about
debugging this problem?
Catherine