Brandon Hoppe
Hi,
I'm trying to find a faster way to do this. Right now I have a perl script that contains
function definitions, and a second perl script that contains calls to these functions.
Now the first perl script will execute the second perl script hundreds of times, in order
to call these functions.
So I have a loop that repeatedly changes global variables and then executes the second
perl script with a do $file; call.
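Stripped down, the loop looks something like this (the file name, globals, and function
names are just made up for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # main script: holds the function definitions and drives the runs
    our ($param1, $param2);          # globals that the second script reads

    sub do_work {                    # one of the many function definitions
        my ($x, $y) = @_;
        print "working on $x and $y\n";
    }

    my $file = 'calls.pl';           # the second script, just a list of calls
    for my $run (1 .. 500) {
        $param1 = $run;              # change the globals for this run
        $param2 = $run * 2;
        do $file;                    # re-reads and re-compiles calls.pl every pass
    }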
This works and all, but it's not very fast. It takes about 1 1/2 seconds to execute the
second script, so over hundreds of runs, you can see why it takes forever.
I was wondering if there's a faster way to repeatedly re-call all these functions, like
reading the second script into an array and then "executing" the array. I doubt this is
possible, but I was thinking of something along these lines: get the file into memory and
execute it over and over again, which should be faster than reading it from disk every time.
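What I had in mind is something like reading the file once, compiling it once into a sub
with a string eval, and then just calling that sub on every pass. This is only a rough
sketch (untested, and it assumes the second script is nothing but plain function calls):

    use strict;
    use warnings;

    our ($param1, $param2);          # same globals as before
    # ... the same function definitions as now go up here ...

    my $file = 'calls.pl';

    # slurp the second script into memory once
    open my $fh, '<', $file or die "can't open $file: $!";
    my $code = do { local $/; <$fh> };
    close $fh;

    # compile it once into an anonymous sub; the string eval happens only here
    my $run_calls = eval "sub { $code }"
        or die "couldn't compile $file: $@";

    for my $run (1 .. 500) {
        $param1 = $run;              # still changing the globals each pass
        $param2 = $run * 2;
        $run_calls->();              # plain sub call: no re-read, no re-compile
    }

The part I'm not sure about is whether wrapping the whole file in a sub like that breaks
anything, but conceptually that's what I'm after.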
Brandon