Jason DiCioccio
I have written a long-running daemon in Ruby to handle dynamic DNS updates.
I have just recently moved it from Ruby 1.6 to Ruby 1.8 and updated all of
its libraries to their latest versions (it uses dbi and dbd-postgres). The
problem I am having now is that it starts out using a sane amount of memory
(around 8 MB), but by the same time the next day it is using close to
200 MB for the Ruby interpreter alone. The daemon code itself is 100% Ruby,
so I don't understand how this leak is happening. Are there any dangerous
code patterns I should look for that could cause this? The only thing I can
think of is that every object returned from a SQL query is .dup'd, since
Ruby DBI passes references. However, those copies should be swept up
automatically by the garbage collector. This is driving me nuts, and I
would love it if someone could point me in the right direction.
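
One way to narrow down which objects are piling up is a minimal leak-hunting sketch like the following (the workload line is just a hypothetical stand-in for one daemon update cycle, not my actual code): snapshot live object counts per class with ObjectSpace, exercise the suspect path, force a full GC, and diff the snapshots. Classes whose counts keep climbing across cycles are the leak suspects.

```ruby
# Count live objects per class using ObjectSpace.
def object_counts
  counts = Hash.new(0)
  ObjectSpace.each_object { |o| counts[o.class] += 1 }
  counts
end

GC.start                  # force a full sweep before the baseline
before = object_counts

# Hypothetical workload standing in for one DNS-update cycle;
# the array keeps references so these strings cannot be collected.
leaked = Array.new(1000) { "x" * 10 }

GC.start                  # sweep again so only real survivors remain
after = object_counts

# Report classes whose live count grew, biggest growth first.
growth = after.map { |klass, n| [klass, n - before.fetch(klass, 0)] }
growth = growth.select { |_, d| d > 0 }.sort_by { |_, d| -d }
growth.first(5).each { |klass, d| puts "#{klass}: +#{d}" }
```

Run this around each iteration of the daemon's main loop; if the .dup'd query results really were being collected, their class should not show sustained growth between GC runs.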
Thanks!
-JD-