Calvine Chew
I've been trying to write a script that will allow me to simultaneously
update several websites in a company LAN environment, across multiple
divisions, just by updating an initial one.
Basically:
1) I upload an update (say a new dataset in a zip file) to website A.
2) The script on website A saves it to disk, unpacks it nicely, then sends
the zip off to website B before deleting it from disk.
3) Website B does the same and sends it off to website C.
4) This process is repeated until it reaches an end node.
Obviously, the scripts on each site have been pre-programmed to hop to the
next server, so the hopping logic is done. I basically use LWP to POST from
HTML form to HTML form until the last form, which just saves the file and
unpacks it.
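For what it's worth, the hop described above is a plain multipart file upload with LWP. A minimal sketch (the next-hop URL, the `upload` field name, and the file name are all hypothetical, not my actual setup):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request::Common qw(POST);

# Forward a zip to the next hop as a multipart/form-data upload,
# then let the caller delete the local copy on success.
sub forward_zip {
    my ($next_hop, $zip) = @_;
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->request(POST $next_hop,
        Content_Type => 'form-data',              # multipart upload
        Content      => [ upload => [ $zip ] ],   # arrayref => send file contents
    );
    return $res->is_success;
}

# Example invocation: hop.pl http://website-b.lan/cgi-bin/receive.pl dataset.zip
if (@ARGV >= 2) {
    forward_zip($ARGV[0], $ARGV[1]) or die "Hop to $ARGV[0] failed\n";
    unlink $ARGV[1] or warn "Could not delete $ARGV[1]: $!\n";
}
```

The receiving CGI script on each hop just reads the `upload` field, writes it to disk, unpacks it, and repeats the POST toward the next server.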
This works beautifully, but because I use LWP, it only works on Perl
installations where the LWP/HTML/HTTP modules have been properly built and
installed with make and make install (please correct me if I'm wrong on
that). I can't seem to get it to work on servers where I did not make and
install the required modules. Carp shows that it can't find a loadable
module for HTML::Parser in the available library paths, even though I've
already used "use lib". In fact, the offenders seem to be HTML::Parser and
HTML::Entities.
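In case it helps diagnose this: `use lib` prepends each directory to @INC, and it also prepends the architecture-specific subdirectory ($dir/$Config{archname}) if one exists, which is where XS modules keep their compiled auto/.../*.so pieces. A small sketch showing what the pragma actually adds (the private lib path is hypothetical):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Config;

# 'use lib' puts the directory at the front of @INC; if
# $dir/$Config{archname} exists, that subdirectory is added too.
# XS modules like HTML::Parser need their compiled loadables found
# under such a path, e.g. .../auto/HTML/Parser/Parser.so
use lib '/home/calvine/perl5lib';   # hypothetical private lib path

print "archname: $Config{archname}\n";
print "front of \@INC: $INC[0]\n";
```

So a pure-Perl copy of the .pm files into a `use lib` directory is not enough for the XS modules; the matching compiled loadables for that platform have to be there as well.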
I suspect this is because HTML::Parser (and I think HTML::Entities too)
uses XS, so it needs compiled binaries and other files besides the .pm
files. Does anyone know how I should go about doing a proper local,
non-make/install installation of LWP/HTML/HTTP on the problematic
server(s)? Or is there another way to look at this problem, perhaps by
directly interfacing with each Perl script without using LWP? I'm not sure
how to do that, though, and socket/port connections are not available to me.
Any comments or suggestions are appreciated!