Ok, first off I should preface this by saying I've never forked before. Guess you could call me a virgin.
Anyways, I'm used to writing scripts (mainly PHP) that _don't_ execute for a very long time and don't really need to do anything with process creation or management. So forgive me if this is a stupid question.
Here's the scenario:

I want to run some code in a loop that runs through the subdirectories of a bunch of different users and does stuff to their files/directories. Normally I would traverse each user's directory one at a time.
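For context, my current single-process version looks roughly like this (the /home/* path and do_stuff() are just made-up placeholders, not my real code):

<?php
// One user at a time, top to bottom.
$userDirs = glob('/home/*', GLOB_ONLYDIR);

foreach ($userDirs as $dir) {
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        do_stuff($file->getPathname()); // placeholder for the real work
    }
}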
But could I get the job done quicker if I ran multiple processes and had several users' directories being traversed at the same time?
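Here's roughly what I'm picturing for the forked version, assuming the pcntl extension is available; traverse_user() is a made-up stand-in for the per-user work:

<?php
$userDirs = glob('/home/*', GLOB_ONLYDIR);
$children = [];

foreach ($userDirs as $dir) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        traverse_user($dir); // child: do one user's work...
        exit(0);             // ...then exit so it doesn't keep looping
    }
    $children[] = $pid;      // parent: remember the child and move on
}

// Parent waits for every child to finish.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}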
I'd have to write a little extra code to "schedule" the processes to make sure that all the users' directories were being traversed an equal number of times (because the loop will run until it's sent a signal, so these directories get traversed over and over again). But that probably won't be a big deal.
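For the "runs until it's sent a signal" part, I'm assuming each worker would look something like this (again pcntl, and traverse_user() is still made up):

<?php
$running = true;

// Flip a flag on SIGTERM so the loop can exit cleanly.
pcntl_signal(SIGTERM, function () use (&$running) {
    $running = false;
});

while ($running) {
    traverse_user($dir);     // $dir is whatever this worker was handed
    pcntl_signal_dispatch(); // deliver any pending signals
}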
Could I get the job done quicker by running multiple processes?
--ed