Strange behavior after reading big file


J Krugman

I have a piece of code that looks something like this:

my %hash;
my $i = 0;
open HUGE, 'Huge_File' or die '>*choke*<';
$| = 1; print "\n";
for my $x (<HUGE>) {
    chomp $x;
    $hash{$x} = $i++ if some_test($x);
    print "\rProcessed up to line $i " unless $i % 10000;
}
close HUGE;
print "Done reading Huge_File\n";

# ... do something with %hash

When I run this code, it dutifully reports processing the total
number of lines (to the nearest 10000), but it never prints "Done
reading ... ". As far as I can tell it just hangs. The code works
fine if the input file is of a "normal" size (1000 lines, say),
but I need to run it on a file that is 25 million lines long, and
it is in this case that I observe the hanging behavior I've just
described.

Any suggestions would be most welcome. FWIW, the OS is Linux, I'm
running Perl 5.6.1, and the machine has 0.5 GB of RAM.

Thanks!

-Jill

 

Guest

My CGI program builds a table from data coming from Postgres, and I now
want to display that table as an Excel file. How can this be done?
Thanks!

Kevin
 

Tad McClellan

J Krugman said:
> for my $x (<HUGE>) {
>
> but I need to run it on a file that is 25 million lines long,

Then don't read the entire thing into memory:

while ( my $x = <HUGE> ) {
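
To make the difference concrete, here is a minimal sketch of Jill's loop
rewritten around that while statement, keeping her own open/die style and
her some_test() predicate (assumed to be defined elsewhere in her script):

my %hash;
my $i = 0;
open HUGE, 'Huge_File' or die '>*choke*<';
$| = 1; print "\n";
while ( my $x = <HUGE> ) {    # reads one line per iteration
    chomp $x;
    $hash{$x} = $i++ if some_test($x);
    print "\rProcessed up to line $i " unless $i % 10000;
}
close HUGE;
print "Done reading Huge_File\n";

The point is that <HUGE> in the original for loop is evaluated in list
context, so Perl slurps all 25 million lines into a temporary list before
the loop even starts; with 0.5 GB of RAM that alone can push the machine
into swap. The while form reads one line at a time, so memory stays
roughly flat apart from whatever %hash itself grows to.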
 
