Running a long script in the background


wattersmt

Hello,

I am trying to write a Python CGI that calls a script over ssh. The
problem is that the script takes a very long time to execute, so Apache
times the CGI out and I never see any output. The script is set
to print a progress report to stdout every 3 seconds, but I never see
any output until the child process is killed.

Here's what I have in my python script:

command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" % (host, domuname)
output = os.popen(command)
for line in output:
    print line.strip()

Here's a copy of the bash script.

http://watters.ws/script.txt

I also tried using os.spawnv to run ssh in the background and nothing
happens.

Does anybody know a way to make output show in real time?
 

jasonmc

> Does anybody know a way to make output show in real time?

You can put:

#!/usr/bin/python -u

at the top of the script to have unbuffered binary stdout and stderr.
 

Thomas Guettler

> Hello,
>
> I am trying to write a Python CGI that calls a script over ssh. The
> problem is that the script takes a very long time to execute, so Apache
> times the CGI out and I never see any output. The script is set
> to print a progress report to stdout every 3 seconds, but I never see
> any output until the child process is killed.
>
> Here's what I have in my python script:
>
> command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" % (host, domuname)
> output = os.popen(command)
> for line in output:
>     print line.strip()

Try sys.stdout.flush() after every print.

Or try something like this:

import sys, time

class FlushFile:
    def __init__(self, fd):
        self.fd = fd
    def flush(self):
        self.fd.flush()
    def write(self, data):
        self.fd.write(data)
        self.fd.flush()

oldstdout = sys.stdout
sys.stdout = FlushFile(sys.stdout)

for i in range(5):
    print "Hello",
    time.sleep(0.5)
print
 

wattersmt

> You can put: #!/usr/bin/python -u
> at the top of the script to have unbuffered binary stdout and stderr.


Thanks. I tried that but it still times out waiting for output.

Everything works fine until I call the popen function; then it
freezes. What I want is to print the output in real time, just like
it does when I run it from a shell.
 

wattersmt

> Thanks. I tried that but it still times out waiting for output.
>
> Everything works fine until I call the popen function; then it
> freezes. What I want is to print the output in real time, just like
> it does when I run it from a shell.


I tried flushing stdout and the same thing happens. As soon as the
os.popen(command) line runs, it stops there; the next print statement
never even runs.

I've also tried using os.spawnv to make the process run in the
background but then the ssh command never runs.
 

Dennis Lee Bieber

> Everything works fine until I call the popen function; then it
> freezes. What I want is to print the output in real time, just like
> it does when I run it from a shell.

And you want /this/ in a web page?

I don't think HTTP is designed for that... As I understand it, the
client expects to get a complete page back, and then the transaction is
complete and forgotten (except for the presence of session cookies).
Reporting dynamically on a web page tends to mean something like a
timed redirect (reload) of the same URL with the cookie, and that is a
completely separate transaction starting a new CGI (or equivalent)
process. AJAX techniques may clean up some of this -- by not really
reloading the whole page, instead updating the DOM based upon the data
transferred.

Unfortunately, I don't know AJAX (I have one or two books, but
haven't read them). Something like: the initial CGI sends down the base
page (with Javascript), and the JS connects back to the long-running
process (which is disconnected from the original CGI -- that process
completed with the transfer of the page) and obtains the updates
periodically.

Caveat: as mentioned, I'm not fully up on AJAX, so the description
may be quite off... The only thing I'm pretty sure of is that using
straight CGI/HTTP mandates "one-shot" data (if you can put the
long-running process in the background, independent of the initiating
CGI, and use a session cookie for ID/status, a timed reload sending the
cookie could be used to query the process for "current status" and
return that).
--
Wulfraed Dennis Lee Bieber KD6MOG
(e-mail address removed) (e-mail address removed)
HTTP://wlfraed.home.netcom.com/
(Bestiaria Support Staff: (e-mail address removed))
HTTP://www.bestiaria.com/
 

Erik Max Francis

> I tried flushing stdout and the same thing happens. As soon as the
> os.popen(command) line runs, it stops there; the next print statement
> never even runs.
>
> I've also tried using os.spawnv to make the process run in the
> background, but then the ssh command never runs.

Based on what you describe, this isn't a good application for a
single-transaction CGI exchange. The timeouts are not happening at the
level of your CGI script, but rather either at the HTTP server itself or
at the remote client. In either case, fixing it as a one-transaction,
one-script solution is not going to be very feasible.

A more sensible way to do it is to have one logical page (which could be
the same physical page if you want) which accepts job requests, spawns
them off in the background, and offers a link to a second logical page
which sees if the job has completed -- showing the results if it has --
or refreshes periodically if it hasn't yet.
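That two-page pattern can be sketched in a few lines. Everything below is illustrative, not code from the thread: subprocess and a temp-file spool stand in for whatever job store the real site would use, and submit()/status() are hypothetical names for what the two logical pages would call.

```python
# Sketch of the submit-then-poll pattern (hypothetical names; a temp
# directory stands in for a real job spool). submit() is what the first
# logical page would call; status() is what the refresh page would call.
import os
import subprocess
import sys
import tempfile
import time
import uuid

SPOOL = tempfile.gettempdir()

def submit(argv):
    """Start argv in the background, logging its output to a job file."""
    job_id = uuid.uuid4().hex
    log_path = os.path.join(SPOOL, "job-%s.log" % job_id)
    with open(log_path, "w") as log:
        # The child outlives this function (and the CGI request).
        subprocess.Popen(argv, stdout=log, stderr=subprocess.STDOUT)
    return job_id

def status(job_id):
    """Return whatever output the job has produced so far."""
    with open(os.path.join(SPOOL, "job-%s.log" % job_id)) as log:
        return log.read()

job = submit([sys.executable, "-c", "print('step 1 done')"])
time.sleep(1.0)              # demo only: give the short job time to finish
print(status(job))
```

The second page simply re-renders status(job) on each timed reload until the job file signals completion.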
 

wattersmt

> And you want /this/ in a web page?
>
> I don't think HTTP is designed for that... As I understand it, the
> client expects to get a complete page back, and then the transaction is
> complete and forgotten (except for the presence of session cookies).
> Reporting dynamically on a web page tends to mean something like a
> timed redirect (reload) of the same URL with the cookie, and that is a
> completely separate transaction starting a new CGI (or equivalent)
> process. AJAX techniques may clean up some of this -- by not really
> reloading the whole page, instead updating the DOM based upon the data
> transferred.


Web pages can show output as it's sent. For testing I created a
script on the server that untars a 600 MB volume; I can see each file
name show up in my browser instantly, just like it should. The other
script I'm trying to run won't show anything until the entire process
is complete, and it's just a bunch of echo statements in a for loop;
I'm not sure why they behave differently.
 

Erik Max Francis

> Web pages can show output as it's sent. For testing I created a
> script on the server that untars a 600 MB volume; I can see each file
> name show up in my browser instantly, just like it should. The other
> script I'm trying to run won't show anything until the entire process
> is complete, and it's just a bunch of echo statements in a for loop;
> I'm not sure why they behave differently.

In a word: buffering.
 

Gabriel Genellina

On Tue, 06 Feb 2007 16:44:52 -0300, (e-mail address removed) wrote:

> Web pages can show output as it's sent. For testing I created a
> script on the server that untars a 600 MB volume; I can see each file
> name show up in my browser instantly, just like it should. The other
> script I'm trying to run won't show anything until the entire process
> is complete, and it's just a bunch of echo statements in a for loop;
> I'm not sure why they behave differently.

If the response does not include a Content-Length header and has a
Transfer-Encoding: chunked header, then it is sent in chunks (blocks)
and the client is able to process it piece by piece.
See the server docs on how to enable and generate a chunked response.
In Zope 2, for example, it's enough to use response.write().

Are you sure the other process is executing, its output isn't buffered,
and you're reading it line by line?
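For reference, the chunked framing itself is simple: each chunk is its payload size in hex, CRLF, the payload, CRLF, and a zero-length chunk terminates the body. A minimal sketch of the encoder (the server normally does this framing for you; this is only to show what goes over the wire):

```python
def chunk(payload):
    # One chunk: hex length, CRLF, payload bytes, CRLF.
    return b"%x\r\n%s\r\n" % (len(payload), payload)

# Two progress lines followed by the terminating zero-length chunk.
body = chunk(b"file-1.txt\n") + chunk(b"file-2.txt\n") + b"0\r\n\r\n"
```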
 

petercable

> Hello,
>
> I am trying to write a Python CGI that calls a script over ssh. The
> problem is that the script takes a very long time to execute, so Apache
> times the CGI out and I never see any output. The script is set
> to print a progress report to stdout every 3 seconds, but I never see
> any output until the child process is killed.
>
> Does anybody know a way to make output show in real time?

Try this:

<code>

# test.py
import os
import sys
import time

def command():
    for x in range(5):
        print x
        sys.stdout.flush()
        time.sleep(1)

def main():
    command = 'python -c "import test; test.command()"'
    print 'running: %s' % command
    output = os.popen(command, 'r', 1)
    while True:
        line = output.readline()
        if line == '':
            break
        sys.stdout.write(line)
        sys.stdout.flush()

if __name__ == '__main__':
    main()

</code>

The problem is with using the file-like object returned by popen as an
iterator: the iterator's internal read-ahead buffering can hold lines
back until the child process exits, so just iterate over it manually
with readline.

Pete
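The manual readline loop above can also be written more compactly with the two-argument form of iter(), which keeps calling readline until it returns the sentinel empty string. A sketch, with a local shell command standing in for the real long-running child:

```python
# Reading a pipe line by line without the file iterator's read-ahead
# buffer; a local shell command stands in for the long-running child.
import os
import sys

output = os.popen("echo one; echo two")
lines = []
for line in iter(output.readline, ""):
    lines.append(line.strip())
    sys.stdout.write(line)
    sys.stdout.flush()
output.close()
```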
 

petercable

> output = os.popen(command, 'r', 1)

OOPS... I imagine the ridiculous buffer size is unnecessary. I was
trying to get it to work with the original for loop iterating on
output; it should work fine without it.

Pete
 

Jordan

> Hello,
>
> I am trying to write a Python CGI that calls a script over ssh. The
> problem is that the script takes a very long time to execute, so Apache
> times the CGI out and I never see any output. The script is set
> to print a progress report to stdout every 3 seconds, but I never see
> any output until the child process is killed.
>
> Here's what I have in my python script:
>
> command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" % (host, domuname)
> output = os.popen(command)
> for line in output:
>     print line.strip()
>
> Here's a copy of the bash script.
>
> http://watters.ws/script.txt
>
> I also tried using os.spawnv to run ssh in the background and nothing
> happens.
>
> Does anybody know a way to make output show in real time?

Just a little note: os.popen has been replaced by the subprocess
module. ;D
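With subprocess, the same read-as-it-arrives loop looks like this. A sketch only: a local Python one-liner stands in for the real ssh command line, and bufsize=1 with universal_newlines=True asks for line-buffered text mode on the parent's side of the pipe.

```python
# Reading a child's output line by line with subprocess instead of
# os.popen; a local one-liner stands in for the real ssh command.
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, "-u", "-c", "print('progress 1'); print('progress 2')"],
    stdout=subprocess.PIPE,
    bufsize=1,                  # line-buffered on the parent side
    universal_newlines=True,    # text mode, str lines
)
lines = []
for line in proc.stdout:        # yields each line as the child emits it
    lines.append(line.strip())
    sys.stdout.write(line)
    sys.stdout.flush()
proc.wait()
```

Passing the command as an argument list (rather than one shell string) also avoids quoting problems with the host and domuname values.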
 

Karthik Gurusamy

> Hello,
>
> I am trying to write a Python CGI that calls a script over ssh. The
> problem is that the script takes a very long time to execute, so Apache
> times the CGI out and I never see any output. The script is set
> to print a progress report to stdout every 3 seconds, but I never see
> any output until the child process is killed.
>
> Here's what I have in my python script:
>
> command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" % (host, domuname)
> output = os.popen(command)

Apart from other buffering issues, it could very well be that ssh
returns all the output in one single big chunk. Try running the ssh
command (with the trailing command) from your shell and see if it
generates output immediately.

There may be some option to make ssh not buffer the data it reads from
the remote command execution. If there is no such option, most likely
you are out of luck. In that case, even if you make your remote
script unbuffered, ssh may be buffering its output.

If both machines share a filesystem, you can do a trick:
make your script write its output unbuffered to a file. Since the
file is mounted and available on both machines, you can start reading
the file from the main Python script (note that you may need a thread
to do it, as your script will otherwise be stuck waiting for the ssh
to complete).

Karthik
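That shared-file trick can be sketched locally: one thread tails the file while the main thread plays the part of the remote script appending flushed progress lines. The file name, line contents, and timings below are all illustrative, and a local temp file stands in for the NFS-mounted one.

```python
# Local sketch of the shared-file trick: a reader thread follows the
# file (like tail -f) while the main thread stands in for the remote
# script writing unbuffered progress lines.
import os
import tempfile
import threading
import time

log = tempfile.NamedTemporaryFile(mode="w", delete=False, suffix=".log")
log_path = log.name
done = threading.Event()
seen = []

def tail():
    # Follow the file, stopping only when the writer is finished and
    # everything it wrote has been read.
    with open(log_path) as f:
        while True:
            line = f.readline()
            if line:
                seen.append(line.rstrip("\n"))
            elif done.is_set():
                break
            else:
                time.sleep(0.05)

reader = threading.Thread(target=tail)
reader.start()

# Stand-in for the remote script: write a line, flush, repeat.
for i in range(3):
    log.write("step %d\n" % i)
    log.flush()
    time.sleep(0.1)
log.close()
done.set()
reader.join()
os.unlink(log_path)
print(seen)
```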
 

tleeuwenburg

> Apart from other buffering issues, it could very well be that ssh
> returns all the output in one single big chunk. Try running the ssh
> command (with the trailing command) from your shell and see if it
> generates output immediately.
>
> There may be some option to make ssh not buffer the data it reads from
> the remote command execution. If there is no such option, most likely
> you are out of luck. In that case, even if you make your remote
> script unbuffered, ssh may be buffering its output.
>
> If both machines share a filesystem, you can do a trick:
> make your script write its output unbuffered to a file. Since the
> file is mounted and available on both machines, you can start reading
> the file from the main Python script (note that you may need a thread
> to do it, as your script will otherwise be stuck waiting for the ssh
> to complete).
>
> Karthik


You could also try flushing the buffer after each status message.
 
