Tom Davis
I am having a problem where a long-running function causes memory
usage to leak / balloon for reasons I cannot figure out. Essentially,
I loop through a directory of pickled files, load them, and run some
other functions on them. In every case, each function uses only local
variables, and I even made sure to `del` each variable at the end of
the loop. However, as the loop progresses, the amount of memory used
steadily increases.
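For context, the loop is shaped roughly like this (a simplified
sketch; `process_data` and `save_result` are placeholders for the
actual functions I run):

```python
import os
import pickle

def run(directory):
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        # load one pickled file at a time
        with open(path, "rb") as f:
            obj = pickle.load(f)
        result = process_data(obj)   # placeholder; uses only local variables
        save_result(result)          # placeholder
        # explicitly drop the references at the end of each pass
        del obj
        del result
```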
I had a related problem before where I would loop through a very
large dataset of files and cache objects that were used to parse or
otherwise operate on different files in the dataset. Once again,
only local variables were used in the cached objects' methods. After
a while it got to the point where simply running these methods on the
data took so long that I had to terminate the process (think: first
iteration 0.01 s, 1000th iteration 10 s). The solution I found was to
mark the cached objects as "stale" after a certain number of uses,
then delete and re-instantiate them.
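That workaround looked roughly like this (again only a sketch; the
factory argument stands in for my actual cached class, and the
threshold is something I tuned by hand):

```python
MAX_USES = 1000  # hand-tuned threshold before an object goes "stale"

class StaleCache:
    def __init__(self, factory):
        self.factory = factory    # e.g. the real parser class
        self.obj = factory()
        self.uses = 0

    def call(self, method, *args):
        if self.uses >= MAX_USES:
            # the cached object has gone "stale": rebuild it from scratch
            self.obj = self.factory()
            self.uses = 0
        self.uses += 1
        return getattr(self.obj, method)(*args)
```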
However, in the current case, there is no caching being done at all.
Only local variables are involved. It would seem that over time the
objects take up more memory even when no attributes are being added
to them or altered. Has anyone experienced similar anomalies? Is this
behavior to be expected for some other reason? If not, is there a
common fix for it, e.g. manual GC?
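By "manual GC" I mean something along these lines: forcing a
collection pass every so often inside the loop with the standard `gc`
module (the placeholders are the same as above):

```python
import gc
import os
import pickle

def run_with_gc(directory, every=100):
    for i, name in enumerate(os.listdir(directory)):
        path = os.path.join(directory, name)
        with open(path, "rb") as f:
            obj = pickle.load(f)
        process_data(obj)   # placeholder, as above
        del obj
        if i % every == 0:
            gc.collect()    # force a full collection pass periodically
```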