Dear Perl community,
A handful of times I've had to edit certain files in a given
directory. For example, I might have to append a blank line to all
*.txt files located in a certain directory (and all its
subdirectories).
For this task, I use the File::Find module, like this:
use File::Find;
find(\&wanted, @directories_to_search);
sub wanted { ... }
As for the wanted function, I could define it like this:
sub wanted
{
    # Skip non-text files:
    return unless -f and m/\.txt$/;

    # Append a newline to the text file:
    open(my $out, '>>', $_) or die "Error with '$_': $!";
    print $out "\n";
    close($out);
}
or I could define it like this:
sub wanted
{
    # Skip non-text files:
    return unless -f and m/\.txt$/;

    # Rename the file to *.bak:
    rename($_, "$_.bak") or die "Cannot rename '$_': $!";

    # Read the *.bak file back into the *.txt file and add a newline:
    open(my $in,  '<', "$_.bak") or die "Error reading '$_.bak': $!";
    open(my $out, '>', $_)       or die "Error writing to '$_': $!";
    print $out <$in>, "\n";
    close($out);
    close($in);
    unlink("$_.bak") or die "Cannot unlink '$_.bak': $!";
}
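(A variant of this second version I've also considered copies the *.bak
file back one line at a time instead of slurping it all at once with
<$in>, in case any of the files are ever large. Just as a sketch:)

sub wanted
{
    # Skip non-text files:
    return unless -f and m/\.txt$/;

    # Rename the file to *.bak:
    rename($_, "$_.bak") or die "Cannot rename '$_': $!";

    # Copy the *.bak file back one line at a time, then add a newline:
    open(my $in,  '<', "$_.bak") or die "Error reading '$_.bak': $!";
    open(my $out, '>', $_)       or die "Error writing to '$_': $!";
    while (my $line = <$in>) {
        print $out $line;
    }
    print $out "\n";
    close($out);
    close($in);
    unlink("$_.bak") or die "Cannot unlink '$_.bak': $!";
}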
In the first definition of &wanted, I simply append to the file
without creating a backup file. In the second version of &wanted, I
create a backup file that ends in *.bak.
My main question is this: Since both versions of &wanted modify the
files they find, will they ruin (or throw off) the File::Find
algorithm?
In the first definition of &wanted, no extra file is created, but
the found file is modified. Will this cause File::Find to "pick up"
the modified file as a new file and modify it again?
In the second definition of &wanted, an extra *.bak file is created
(and later unlinked) for every file modified. The original file is
never deleted, but its contents are entirely rewritten.
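I suppose I could also just test this empirically. Something like the
sketch below, run against a throwaway directory (here I'm assuming a
scratch directory named tmp_test containing a few *.txt files), should
reveal whether any file gets visited more than once, though I'm not
sure a result on my machine would generalize:

use strict;
use warnings;
use File::Find;

# Count how many times each file is visited; if any count exceeds 1,
# File::Find re-picked up a file that was modified during the search.
my %seen;
find(sub {
    return unless -f and m/\.txt$/;
    $seen{$File::Find::name}++;
    open(my $out, '>>', $_) or die "Error with '$_': $!";
    print $out "\n";
    close($out);
}, 'tmp_test');

print "$_ visited $seen{$_} time(s)\n" for sort keys %seen;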
Does anyone have any input on this? I read through "perldoc
File::Find" for mention of "re-picking-up" file entries that were
already modified, but I couldn't find any.
I'm wondering if this is a machine-specific issue. If it is, it
would probably be safest if I wrote the &wanted function like this:
my @files;

sub wanted
{
    # Skip non-text files:
    return unless -f and m/\.txt$/;

    push @files, $File::Find::name;
}
and then loop through the @files array, like this:
foreach my $file (@files)
{
    # Append a newline to the text file:
    open(my $out, '>>', $file) or die "Error with '$file': $!";
    print $out "\n";
    close($out);
}
That way I won't modify anything until I've found all the files to
process.
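Put together as a complete script (here I'm assuming the directories
to search are passed on the command line), that approach would look
something like this:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my @directories_to_search = @ARGV;
my @files;

# First pass: collect the full paths of all *.txt files.
find(sub {
    return unless -f and m/\.txt$/;
    push @files, $File::Find::name;
}, @directories_to_search);

# Second pass: append a newline to each collected file.
foreach my $file (@files) {
    open(my $out, '>>', $file) or die "Error with '$file': $!";
    print $out "\n";
    close($out);
}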
So should I stick to the last method I mentioned, or am I worrying
for nothing? (Or is there a better method I'm not aware of yet?)
Thanks for any input.
-- Jean-Luc