Alan J. Flavell
Folks, at the risk of being ruled OT, I'd appreciate guidance on the
following kind of task, which we need to solve and which I'm proposing
to do from an ActivePerl script. Assume Win2K or similar (NT4 maybe).
A particular piece of software on laptops is to be kept updated "over
the network" from a distribution server. The laptops are only
intermittently active, and even when they're active they may or may
not have network access.
The network copy that they're to be kept updated against will be
changed (let's say) once a day. So: once an update has been
successful, it's pointless to run it again until the next distribution
is due (although I suppose it would be OK for a low-fat script to
start up, take a look at some last-successful-run timestamp, and then
calmly exit). But if the laptop goes online even once in a day, we
don't want to miss that day's update.
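
To make that concrete, here's the sort of low-fat check I have in mind.
It's only a sketch: the stamp-file path, the distribution share and
run_update() are placeholders, not our real setup.

  # Sketch only: paths and run_update() are placeholders
  use strict;
  use warnings;
  use POSIX qw(strftime);

  my $stamp_file = 'C:/laptop-update/last-success.txt';
  my $dist_share = '\\\\distserver\\updates';
  my $today      = strftime('%Y-%m-%d', localtime);

  # Date of the last successful update, if any
  my $last;
  if (open my $fh, '<', $stamp_file) {
      $last = <$fh>;
      close $fh;
      chomp $last if defined $last;
  }

  # Already updated against today's distribution: calmly exit
  exit 0 if defined $last and $last eq $today;

  # Server not reachable just now: try again at the next scheduled run
  exit 0 unless -d $dist_share;

  # Do the real work, then record the success
  if (run_update($dist_share)) {
      open my $fh, '>', $stamp_file or die "can't write $stamp_file: $!";
      print $fh "$today\n";
      close $fh;
  }

  sub run_update {
      my ($share) = @_;
      # ... compare/copy files from $share here ...
      return 1;
  }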
However, if we set the task scheduler to run the script every couple
of hours, it could well go for days before the scheduled run happens
to hit a moment when the laptop is active and has network access.
If we set it to run every few minutes, the users complain of the task
disrupting their work - especially as, despite much hunting around, I
don't seem to be able to get the Perl script to run without briefly
flashing up a DOS window (there must be an answer to that, but somehow
I can't find it - you can tell I'm no great Windows fan).
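
One thing I've wondered about is having the script hide its own console
as its very first action, along the lines sketched below (assuming
Win32::API is installed, and GetConsoleWindow, which I believe needs
Win2K rather than NT4) - though I'd expect that only to shorten the
flash rather than remove it. I gather ActivePerl also installs
wperl.exe, a console-less counterpart to perl.exe, which may be the
cleaner answer.

  # Sketch: hide this script's own console window at startup.
  # Needs Win32::API; GetConsoleWindow is Win2K-and-later.
  # The window still exists for an instant, so this only shortens the flash.
  use strict;
  use warnings;
  use Win32::API;

  my $GetConsoleWindow = Win32::API->new('kernel32', 'GetConsoleWindow', [], 'N');
  my $ShowWindow       = Win32::API->new('user32',   'ShowWindow', ['N', 'N'], 'N');

  if ($GetConsoleWindow and $ShowWindow) {
      my $hwnd = $GetConsoleWindow->Call();
      $ShowWindow->Call($hwnd, 0) if $hwnd;    # 0 == SW_HIDE
  }

  # ... rest of the update logic as before ...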
Is the task scheduler the right approach to this at all, or should I
perhaps be looking to run the script as a system service or something?
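
To clarify what I mean by "or something": the alternative I can picture
is a single long-lived process (started at logon, or wrapped as a real
service - I believe there's a Win32::Daemon module for that route) which
naps between cheap checks, instead of the scheduler launching a fresh
process every few minutes. Roughly:

  # Sketch of the always-running alternative: wake every so often, do the
  # cheap stamp/network test, and only occasionally do real work.  Run it
  # under wperl.exe (or as a service) so there is no console to flash.
  use strict;
  use warnings;

  my $check_interval = 15 * 60;    # seconds between cheap checks

  while (1) {
      eval {
          attempt_update() unless already_done_today();
      };
      warn "update attempt failed: $@" if $@;
      sleep $check_interval;
  }

  sub already_done_today { return 0 }    # the stamp-file test from the earlier sketch
  sub attempt_update     { return 1 }    # the network test plus the real copy step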
This kind of job doesn't seem to me to be an unusual requirement; I
feel sure someone has successfully done something similar. Please?