Majnu
Hello,
I have a strange problem. I have a flat file of about 6 million rows,
each row about 600 bytes long. I read the file line by line after
opening it like this:
open(IN, "cat $InputFile |") or die "Failed to open the File";
## This was changed from open(IN, "< $InputFile") because Perl
## outright refused to open the file.
while(<IN>) {......
The problem is that, at times, Perl just stops after reading 3,334,601
records, with no error message printed. The problem does not occur
every time; it happens sporadically, which makes it hard to track
down, because if I re-process the file it gets read completely.
Would someone please shed some light on how this could be happening?
Is this something related to memory?
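One way I could try to narrow it down is to go back to a direct open (reporting $! so a refusal explains itself) and to check explicitly whether the loop stopped at end-of-file or on a read error. Below is a minimal sketch of that idea; the file path and the processing placeholder are only stand-ins, not my actual script:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical path standing in for $InputFile
my $InputFile = '/path/to/flatfile.dat';

# Open the file directly; include $! so a failed open says why it failed
open(my $in, '<', $InputFile)
    or die "Cannot open $InputFile: $!";

my $count = 0;
while (my $line = <$in>) {
    $count++;
    # ... process $line ...
}

# If the loop ended but we are not at end-of-file, the read itself
# failed (e.g. an I/O error), which would otherwise look like a
# silent early stop.
die "Read error on $InputFile after $count records: $!" unless eof($in);

close($in) or warn "Error closing $InputFile: $!";
print "Processed $count records\n";

If this version also stops early, the eof/close checks should at least say whether the filehandle hit a real I/O error or the loop simply exited.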