Hi,
I'm trying to write text to the write handle of a pipe (created with the
pipe() function) so that I can later read the text back from the
corresponding read handle. However, I've discovered that if I write a lot
of text to the pipe, my script just hangs. Here is an example program:
#!/usr/bin/perl
use strict;
use warnings;

print "Enter a number: ";
my $number = <STDIN>;
chomp($number);

my @lines = do
{
    pipe(my $readHandle, my $writeHandle);

    # Autoflush $writeHandle:
    my $oldHandle = select($writeHandle);
    $| = 1;
    select($oldHandle);

    print $writeHandle "$_\n" foreach 1 .. $number;
    close($writeHandle);

    # Extract the output, line-by-line:
    <$readHandle>
};

print "Extracted output lines:\n @lines";
__END__
When I run this program, it works perfectly for small values of $number
(like 10), but for large values (like ten thousand) it hangs.
From testing, I discovered that the limit on the Windows platform
I'm using is 155, and the limit on the Linux platform I'm using is
1040. Any higher number causes the program to hang.
As for why this is happening, my best guess is that a program can only
stuff so much output into a pipe before its buffer fills up. At that
point the print blocks, and deadlock occurs, because the reading won't
start until all the writing is finished.
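That guess seems to line up with the numbers, too. If I count the bytes each
print sends (the number itself plus a newline), my observed limits land just
under suspiciously round buffer sizes. Here is the little byte-counting check
I used (the idea that the buffers are exactly 4096 and 512 bytes is only my
back-of-the-envelope guess, not something I've confirmed anywhere):

#!/usr/bin/perl
use strict;
use warnings;

# Count the bytes written for the lines "1\n" .. "$n\n".
sub bytes_for
{
    my ($n) = @_;
    my $total = 0;
    $total += length("$_\n") for 1 .. $n;
    return $total;
}

print bytes_for(1040), "\n";   # 4093 -- the next line, "1041\n", would push past 4096
print bytes_for(155),  "\n";   # 512  -- the next line, "156\n", would push past 512

So on my Linux box the hang starts exactly where a 4096-byte pipe buffer would
fill up, and on Windows exactly where a 512-byte buffer would, which at least
makes the capacity theory look plausible to me.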
However, I'm not fully convinced this is the case, because I
replaced the lines:
print $writeHandle "$_\n" foreach 1 .. $number;
close($writeHandle);
with:
if (fork() == 0)
{
    # Only the child process gets here:
    print $writeHandle "$_\n" foreach 1 .. $number;
    close($writeHandle);
    exit(0);
}
and now the Perl script hangs on both the Windows and Linux platforms,
even for small values of $number (such as 5). My intent was to make the
child process solely responsible for stuffing the output into the pipe,
while the parent process read from $readHandle as data became available.
That way the pipe would never fill to capacity.
But as I've said, that fork()ing change doesn't work for any value of
$number, so I must be doing something wrong somewhere.
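For completeness, here is the full script with the fork() change in place,
exactly as I've been running it (nothing else is different from the first
version):

#!/usr/bin/perl
use strict;
use warnings;

print "Enter a number: ";
my $number = <STDIN>;
chomp($number);

my @lines = do
{
    pipe(my $readHandle, my $writeHandle);

    # Autoflush $writeHandle:
    my $oldHandle = select($writeHandle);
    $| = 1;
    select($oldHandle);

    if (fork() == 0)
    {
        # Only the child process gets here:
        print $writeHandle "$_\n" foreach 1 .. $number;
        close($writeHandle);
        exit(0);
    }

    # Extract the output, line-by-line:
    <$readHandle>
};

print "Extracted output lines:\n @lines";
__END__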
So my question is: How do I prevent my script from hanging when I
have a lot of text to send through the pipe?
Thanks in advance for any help.
-- Jean-Luc