J Krugman
I have a piece of code that looks something like this:
my %hash;
my $i = 0;
open HUGE, 'Huge_File' or die '>*choke*<';
$| = 1; print "\n";
for my $x (<HUGE>) {
    chomp $x;
    $hash{$x} = $i++ if some_test($x);
    print "\rProcessed up to line $i " unless $i % 10000;
}
close HUGE;
print "Done reading Huge_File\n";
# ... do something with %hash
When I run this code, it dutifully reports processing the total
number of lines (to the nearest 10000), but it never prints "Done
reading ... ". As far as I can tell it just hangs. The code works
fine if the input file is of a "normal" size (1000 lines, say),
but I need to run it on a file that is 25 million lines long, and
it is in this case that I observe the hanging behavior I've just
described.
Any suggestions would be most welcome. FWIW, the OS is Linux, I'm
running Perl 5.6.1, and the machine has 0.5 GB of RAM.
Thanks!
-Jill
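Below is a minimal sketch of the usual line-at-a-time alternative, on the
assumption (not confirmed by the post itself) that the hang comes from
"for my $x (<HUGE>)" evaluating the filehandle in list context, which pulls
all 25 million lines into memory before the loop body ever runs. Only the
loop header changes; everything else, including some_test() and the
Huge_File name, is taken straight from Jill's code.

my %hash;
my $i = 0;
open HUGE, 'Huge_File' or die '>*choke*<';
$| = 1; print "\n";
# while (...) reads <HUGE> one line per iteration instead of building a
# 25-million-element list first; Perl implicitly tests the assignment
# with defined() when it is the sole condition of the while.
while (my $x = <HUGE>) {
    chomp $x;
    $hash{$x} = $i++ if some_test($x);
    print "\rProcessed up to line $i " unless $i % 10000;
}
close HUGE;
print "Done reading Huge_File\n";

If the list-context slurp is indeed the culprit, the progress messages from
this version would appear while the file is being read, rather than only
after the whole file has been loaded into memory.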