Yohan said:
The sub I want to fork (async, or Win32::Process) is quite long, but
mainly it does calculations against the user input (collected by the main
process) and data already written in server files, then updates those
same server files with the new results. However, during this time, the main
process has to continue and must not wait for anything from this
secondary process (because of this, I specified DETACHED_PROCESS with
Win32::Process, $SIG{CHLD} = 'IGNORE' when I went through fork(), and
detach() for threads).
First things first, this is a CGI program, so let's keep it simple, if we
can. The CGI program is getting the new data submitted by the user, and
that data needs to be processed (short or long), and the server files are
to be updated with the new results. Apparently, the CGI program, having
received the new data, doesn't need to wait around for that long background
processing, so you just want to queue that long background processing and
return some HTML content to the user. Right? No problem.
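As a minimal sketch of that idea (assuming a POSIX-style fork environment; the sub name and the job passed to it are invented for illustration), the CGI script can hand the long job to a detached child and return at once:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Launch $job (a code ref) in a detached child so the CGI response
# can be sent without waiting for it. Illustrative only.
sub launch_in_background {
    my ($job) = @_;
    $SIG{CHLD} = 'IGNORE';         # auto-reap: no zombie to wait for
    defined( my $pid = fork() ) or die "fork failed: $!";
    return $pid if $pid;           # parent: go build the HTML reply

    # Child: detach from the web server's session and standard handles,
    # otherwise the server may hold the HTTP connection open.
    setsid();
    open STDIN,  '<', '/dev/null';
    open STDOUT, '>', '/dev/null';
    open STDERR, '>', '/dev/null';
    $job->();                      # the long calculation/file update
    exit 0;
}
```

The CGI program would then call something like `launch_in_background(sub { long_update(@params) });` and print its page immediately.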
Do you want to ensure that your long background processing has always
completed before receiving the next CGI submission of more new data to be
calculated against the server file data, which may or may not have finished
updating from the last session? You may need to develop a means to verify
that the current server file data you're using in your short
"on-the-fly" calculations is completely up to date. How many user sessions
will be occurring concurrently? Maybe you will actually want to wait for
the new data to be fully processed before returning that HTML content to
the user, or users.
What I want is for this long calculation/file-writing sub to be launched
in an asynchronous manner. Also, I don't want to put this sub in a second
script, because it may be called in another place in the main process, but
with parameters that make it a short computation that time; in other
words, depending on the sub's parameters, this sub will be long or short.
And for the long call (only one in the entire script), I wish it to be
non-blocking: fork and/or threads seemed ideal for this... But...
Seems to me like you need to create a new Perl module that contains your
reusable (short and long) processing code.
One program using that new Perl module of yours is the CGI program
performing the short "on-the-fly" processing, and another new program of
yours, to be developed to run in the background, can use that same new
Perl module to watch your server file queue for the long "batch"
processing.
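A rough sketch of that layout (the module name, sub names, and the queue-reading helper are all invented here): the shared module exposes one sub whose parameters decide between the short and the long variant, and the background watcher just loops over the queue calling the long variant:

```perl
package My::Calc;    # hypothetical module name
use strict;
use warnings;

# One reusable sub; the 'mode' parameter decides short vs. long work.
sub process {
    my (%args) = @_;
    if ( ($args{mode} // 'short') eq 'long' ) {
        # ... heavy batch calculation against the server files ...
        return "long result for $args{input}";
    }
    # ... quick on-the-fly calculation ...
    return "short result for $args{input}";
}

1;

package main;

# The CGI program calls the short variant directly:
my $quick = My::Calc::process( mode => 'short', input => 'user data' );

# The background program would run a loop like this over a real queue
# (read_queue() is hypothetical):
# while (1) {
#     My::Calc::process( mode => 'long', input => $_ ) for read_queue();
#     sleep 5;
# }
```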
If the server file queue is empty and unlocked, then you can safely assume
that your server files are up to date for the short "on-the-fly" processing
to be done accurately in the CGI program. You'll need to develop a locking
mechanism to handle the concurrency issues of multiple processes and
multiple users.
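One common way to build that locking mechanism in Perl is flock() on a sentinel file; both the CGI program and the background worker would take the lock before touching the server files. A sketch (the sub name and lock-file path are just examples):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Run $code while holding an exclusive lock on a sentinel file, so the
# CGI process and the background worker never update the server files
# at the same time.
sub with_server_files_locked {
    my ($lockfile, $code) = @_;
    open my $lock, '>>', $lockfile or die "can't open $lockfile: $!";
    flock( $lock, LOCK_EX ) or die "can't lock $lockfile: $!";
    my $result = $code->();       # ... read/update the server files ...
    flock( $lock, LOCK_UN );
    close $lock;
    return $result;
}
```

Usage would look like `with_server_files_locked('/tmp/server_files.lock', sub { update_files() });`; flock() blocks until the other process releases the lock, which is exactly the "queue empty and unlocked" check described above.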
I hope this helps.
Eric