ccc31807
My big data file looks like this:
1,al
2,becky
3,carl
4,debbie
5,ed
6,frieda
.... for perhaps 200K or 300K lines
My change file looks like this:
5, edward
.... for perhaps ten or twelve lines
My script looks like this (SKIPPING THE DETAILS):
my %big_data_hash;
while (<BIG>)    { chomp; my ($id, $name) = split /,\s*/; $big_data_hash{$id} = $name; }
while (<CHANGE>) { chomp; my ($id, $name) = split /,\s*/; $big_data_hash{$id} = $name; }
foreach my $id (sort { $a <=> $b } keys %big_data_hash)
    { print OUT qq($id,$big_data_hash{$id}\n); }
This seems wasteful to me, loading several hundred thousand lines of data in memory just to make a few changes. Is there any way to tie the data file to a hash and make the changes directly?
Does anyone have any better ideas?
Thanks, CC.
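For reference, the tie the question asks about can be done with the core Tie::File module, which maps a file to an array of lines and only rewrites the lines you assign to. A minimal sketch (the file names big_data.txt and changes.txt are assumptions, not from the original post):

```perl
use strict;
use warnings;
use Tie::File;

# Read the small change file into a hash first; it is only a dozen lines.
my %changes;
open my $change_fh, '<', 'changes.txt' or die "changes.txt: $!";
while (<$change_fh>) {
    chomp;
    my ($id, $name) = split /,\s*/;
    $changes{$id} = $name;
}
close $change_fh;

# Tie the big file to an array: lines are fetched on demand, and
# assigning to an element rewrites that line in the file.
tie my @lines, 'Tie::File', 'big_data.txt' or die "big_data.txt: $!";
for my $i (0 .. $#lines) {
    my ($id) = split /,/, $lines[$i];
    if (exists $changes{$id}) {
        $lines[$i] = "$id,$changes{$id}";
        delete $changes{$id};
        last unless %changes;   # stop scanning once every change is applied
    }
}
untie @lines;
```

Note that Tie::File still has to walk the file to find the matching lines, and rewriting a line of a different length shifts everything after it on disk, so for 200K-300K lines the original read-all, write-all script may well be simpler and no slower.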