Hello,
Here is the partial code that reads the data from a txt file.
use strict;
use warnings;

my $mainfile = '07302006file';   # pipe-delimited master file
my %Master_Hash;
my $i = 0;

open(my $mainfh, '<', $mainfile)
    or die("Could not open master file '$mainfile': $!");
# Note: foreach pulls every line of the file into a list before looping.
foreach my $line (<$mainfh>) {
    $i++;
    chomp($line);
    my @values = split(/\|/, $line);
    $Master_Hash{$values[3]} = \@values;   # key on the fourth field
    if ($i % 10000 == 0) {
        #print("loaded $i lines in hash so far - last entry was: $values[3]\n");
        my $size = keys(%Master_Hash);
        my $scalarSize = scalar %Master_Hash;
        print "Loaded $i entries - #ofKeys: $size - ScalarSize: $scalarSize\n";
    }
}
close($mainfh);
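Since every field of @values becomes its own Perl scalar (each carrying header overhead well beyond the bytes of the string itself), one lower-overhead variant worth trying is to store each record as its raw joined line and split it into fields only on access, reading with while() so the whole file isn't also held as a temporary list. This is only a sketch; whether it actually shrinks the footprint here is an assumption to verify:

use strict;
use warnings;

my $mainfile = '07302006file';
my %Master_Hash;

open(my $mainfh, '<', $mainfile)
    or die("Could not open master file '$mainfile': $!");
while (my $line = <$mainfh>) {             # reads one line at a time
    chomp($line);
    my $key = (split /\|/, $line, 5)[3];   # fourth field, without splitting the rest
    $Master_Hash{$key} = $line;            # keep the record as one string
}
close($mainfh);

# Split into fields only when a record is actually used:
# my @values = split(/\|/, $Master_Hash{$key});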
And each line is about 420 characters:
$ wc -l -L 07*file
238348 449 07302006file
After it finishes loading these ~240k lines into the hash (only about 100 MB of raw text, going by the wc output above), the Windows XP Task Manager reports 1.91 GB of usage.
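To pin down where the memory goes, here is a rough measurement sketch. It assumes the CPAN module Devel::Size is installed, and the record below is just a made-up string of about the right shape; the byte counts Devel::Size reports are approximations:

use strict;
use warnings;
use Devel::Size qw(total_size);

# Fabricated ~420-character, pipe-delimited record (9 fields).
my $line   = 'a' x 100 . ('|' . 'b' x 40) x 8;
my @values = split(/\|/, $line);

# total_size() follows references and counts everything reachable,
# so it approximates the cost of one hash entry under each scheme.
print "as array ref:  ", total_size(\@values), " bytes\n";
print "as one string: ", total_size($line),    " bytes\n";

Multiplying the per-entry difference by ~240k entries should show how much of the 1.91 GB is per-scalar overhead rather than data.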
-Tak