creating pipelines in python

per

hi all,

i am looking for a python package that makes it easier to create a
"pipeline" of scripts (all in python). what i do right now is have a
set of scripts that produce certain files as output, and i simply have
a "master" script that checks at each stage whether the output of the
previous script exists, using functions from the os module. this has
several flaws, and i am sure someone has thought of nice abstractions
for making these kinds of wrappers easier to write.

does anyone have any recommendations for python packages that can do
this?

thanks.
 

Lie Ryan

per said:
hi all,

i am looking for a python package to make it easier to create a
"pipeline" of scripts (all in python). what i do right now is have a
set of scripts that produce certain files as output, and i simply have
a "master" script that checks at each stage whether the output of the
previous script exists, using functions from the os module. this has
several flaws and i am sure someone has thought of nice abstractions
for making these kind of wrappers easier to write.

does anyone have any recommendations for python packages that can do
this?

thanks.

You're currently implementing a pseudo-pipeline:
http://en.wikipedia.org/wiki/Pipeline_(software)#Pseudo-pipelines

If you want to create a unix-style, byte-stream-oriented pipeline, have
all scripts write output to stdout and read from stdin (i.e. read with
raw_input and write with print). Since a unix pipeline is byte-oriented,
you will need to parse the input and format the output to/from a format
agreed between the scripts. A more general approach could use more than
two streams; you can use file-like objects to represent each stream.
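A single stage in such a pipeline might look like the sketch below; the
"one record per line" format is an assumed convention between the scripts,
and the uppercasing stands in for whatever processing the stage does:

```python
import sys

def filter_stream(infile, outfile):
    """One unix-style filter stage: read records line by line,
    transform each, and write the results out."""
    # "one record per line" is an assumed convention between the scripts
    for line in infile:
        record = line.strip()
        if record:                      # skip blank lines
            outfile.write(record.upper() + "\n")

# in a real script this would be wired to the process's own streams so it
# can sit in a shell pipeline:  filter_stream(sys.stdin, sys.stdout)
```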

For a more pythonic pipeline, you can rewrite your scripts as
generators and use generator/list comprehensions that read objects from
one FIFO queue and write objects to another (a queue can be
implemented with a list, but take a look at Queue.Queue in the standard
library). Basically an Object Pipeline:
http://en.wikipedia.org/wiki/Pipeline_(software)#Object_pipelines
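As a sketch of such an object pipeline (the stage functions are
illustrative), each stage reads from an input queue and writes to an
output queue, with a sentinel object marking the end of the stream:

```python
from queue import Queue   # Queue.Queue on Python 2
from threading import Thread

SENTINEL = None  # marks the end of the stream

def stage(func, inq, outq):
    """Run func over every object on inq, pushing results to outq."""
    while True:
        item = inq.get()
        if item is SENTINEL:
            outq.put(SENTINEL)   # propagate end-of-stream to the next stage
            break
        outq.put(func(item))

# wire two stages together: double each number, then stringify it
q1, q2, q3 = Queue(), Queue(), Queue()
Thread(target=stage, args=(lambda x: x * 2, q1, q2)).start()
Thread(target=stage, args=(str, q2, q3)).start()

for n in [1, 2, 3]:
    q1.put(n)
q1.put(SENTINEL)

results = []
while True:
    item = q3.get()
    if item is SENTINEL:
        break
    results.append(item)
# results is now ["2", "4", "6"]
```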

For a unix-style pipeline, shell/batch scripts are the best tool,
though you can also use the subprocess module and redirect each
process's stdin and stdout. For an object pipeline, it can't be simpler
than passing an input queue and an output queue to each script.

For in-script pipelines (cf. inter-script pipelines), you can use
generator/list comprehensions and iterators. There are indeed several
modules intended to provide slightly neater syntax than
comprehensions: http://code.google.com/p/python-pipeline/ though I
personally prefer comprehensions.
 

Robert Kern

per said:
hi all,

i am looking for a python package to make it easier to create a
"pipeline" of scripts (all in python). what i do right now is have a
set of scripts that produce certain files as output, and i simply have
a "master" script that checks at each stage whether the output of the
previous script exists, using functions from the os module. this has
several flaws and i am sure someone has thought of nice abstractions
for making these kind of wrappers easier to write.

does anyone have any recommendations for python packages that can do
this?

You may want to try joblib or ruffus. I haven't had a chance to evaluate either
one, though.

http://pypi.python.org/pypi/joblib/
http://pypi.python.org/pypi/ruffus/

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
that is made terrible by our own mad attempt to interpret it as though it had
an underlying truth."
-- Umberto Eco
 

Paul Rudin

per said:
hi all,

i am looking for a python package to make it easier to create a
"pipeline" of scripts (all in python). what i do right now is have a
set of scripts that produce certain files as output, and i simply have
a "master" script that checks at each stage whether the output of the
previous script exists, using functions from the os module. this has
several flaws and i am sure someone has thought of nice abstractions
for making these kind of wrappers easier to write.

does anyone have any recommendations for python packages that can do
this?

Not entirely what you're looking for, but the subprocess module is
easier to work with for this sort of thing than os. See e.g. <http://docs.python.org/library/subprocess.html#replacing-shell-pipeline>
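For example, chaining two stages the way the linked docs describe is a
matter of wiring one process's stdout into the next one's stdin (the
commands here are just illustrative):

```python
import subprocess

# replace a shell pipeline like `printf ... | sort` by connecting
# the stdout of one process to the stdin of the next
p1 = subprocess.Popen(["printf", "b\na\nc\n"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["sort"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()          # lets p1 receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
# out == b"a\nb\nc\n"
```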
 

Wolodja Wentland

i am looking for a python package to make it easier to create a
"pipeline" of scripts (all in python). what i do right now is have a
set of scripts that produce certain files as output, and i simply have
a "master" script that checks at each stage whether the output of the
previous script exists, using functions from the os module. this has
several flaws and i am sure someone has thought of nice abstractions
for making these kind of wrappers easier to write.
does anyone have any recommendations for python packages that can do
this?

There are various possibilities. I would suggest you have a look at [1]
which details the creation of pipelines with generators that can be used
within *one* program. If you want to chain different programs together
you can use the subprocess package in the stdlib of Python 2.6.

[1] http://www.dabeaz.com/generators/
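In that style, each stage is a generator that consumes the previous one,
so items flow through lazily, one at a time (a minimal sketch with
illustrative stages):

```python
def read_records(lines):
    """First stage: normalize raw lines into records."""
    for line in lines:
        line = line.strip()
        if line:
            yield line

def parse_numbers(records):
    """Second stage: convert records to integers."""
    for rec in records:
        yield int(rec)

# stages compose by nesting; nothing runs until the result is consumed
pipeline = parse_numbers(read_records(["1\n", "\n", "2\n", "3\n"]))
total = sum(pipeline)   # total == 6
```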
 

per

Thanks to all for your replies. i want to clarify what i mean by a
pipeline. a major feature i am looking for is the ability to chain
functions or scripts together, where the output of one script -- which
is usually a file -- is required for another script to run, so one
script has to wait for the other. i would like to do this over a
cluster, where some of the scripts are distributed as separate jobs on
a cluster but the results are then collected together. so the ideal
library would have easy facilities for expressing things like:
scripts X and Y run independently, but script Z depends on the output
of X and Y (which is such and such file or file flag).
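on a single machine, that X/Y/Z structure could be sketched with
concurrent.futures (the task functions here are placeholders; on a real
cluster the executor would be replaced by the job scheduler):

```python
from concurrent.futures import ThreadPoolExecutor

def run_x():
    return "output-of-X"     # placeholder for script X

def run_y():
    return "output-of-Y"     # placeholder for script Y

def run_z(x_out, y_out):
    return (x_out, y_out)    # placeholder for script Z, which needs both

with ThreadPoolExecutor() as pool:
    fx = pool.submit(run_x)  # X and Y run independently, in parallel
    fy = pool.submit(run_y)
    # Z starts only once both results exist
    result = run_z(fx.result(), fy.result())
```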

is there a way to do this? i'd prefer not to use a framework that
requires control of the cluster, like Disco, but something
lightweight and simple. right now ruffus seems most relevant, but i am
not sure -- are there other candidates?

thank you.
 

Stefan Behnel

per, 25.11.2009 17:42:
Thanks to all for your replies. i want to clarify what i mean by a
pipeline. a major feature i am looking for is the ability to chain
functions or scripts together, where the output of one script -- which
is usually a file -- is required for another script to run. so one
script has to wait for the other. i would like to do this over a
cluster, where some of the scripts are distributed as separate jobs on
a cluster but the results are then collected together. so the ideal
library would have easily facilities for expressing this things:
script X and Y run independently, but script Z depends on the output
of X and Y (which is such and such file or file flag).

is there a way to do this? i prefer not to use a framework that
requires control of the clusters etc. like Disco, but something that's
light weight and simple. right now ruffus seems most relevant but i am
not sure -- are there other candidates?

As others have pointed out, a Unix pipe approach might be helpful if you
want the processes to run in parallel. You can send the output of one
process to stdout, a network socket, an HTTP channel or whatever, and have
the next process read it and work on it while it's being generated by the
first process.

Looking into generators is still a good idea, even if you go for a pipe
approach. See the link posted by Wolodja Wentland.

Stefan
 
