Garry Hodgson
a friend is trying to track down the source of a mysterious
slowdown that's happening in a webware app he's doing.
he's got an external precompiled application that he invokes
from python using commands.getoutput(). usually it runs quickly
(1-2 secs), but sometimes it takes much longer (80-90 secs).
he's instrumented the program to verify that it's the query invocation
that's spending all the time. he's run the query in a test rig outside
of python, and it seems to run normally all the time. but when run
from python, via webware, he gets these wide performance variations.
his current hypothesis is that when python runs his command, it
"nices" it down in priority, so it's more susceptible to other load
on the machine. i searched the python source, and don't see anyplace
that appears to do this. but i thought i'd check here anyway.
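one quick way to test the nice-level theory directly: a child spawned
through a shell the way commands.getoutput() does it inherits the
parent's priority, and you can check that empirically. this sketch
assumes a POSIX system where the `nice` utility with no arguments
prints the current nice value (commands.getoutput is python 2;
subprocess.getoutput is the modern equivalent and spawns the child
the same way, via the shell):

```python
import os
import subprocess

# query our own nice value (a delta of 0 leaves it unchanged
# and returns the current value)
parent_nice = os.nice(0)

# run the POSIX `nice` utility with no operands in a shell-spawned
# child; it prints the child's current nice value to stdout
child_nice = int(subprocess.getoutput("nice"))

# if python were re-nicing its children, these would differ
print("parent:", parent_nice, "child:", child_nice)
```

if the two values match (as i'd expect, since neither commands nor
subprocess touches priority), the slowdown is likely elsewhere --
worth looking at shell startup, environment differences under
webware, or contention on the machine instead.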
can anyone support or reject his theory?
thanks