Justin C
My web host is running Perl 5.8.8, other software there is of a
similar age, and some things are missing (I wanted to 'nice' my
program, but there is no 'nice').
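(One thing worth knowing about the missing 'nice': Perl exposes the
underlying setpriority(2) system call as a builtin, so a script can
lower its own priority without any external program. A minimal sketch;
the eval is there because not every platform implements the call:)

```perl
#!/usr/bin/perl
# A process can renice itself via Perl's built-in setpriority(),
# even when no `nice` binary is installed on the host.
use warnings;
use strict;

# WHICH=0 (PRIO_PROCESS), WHO=0 (this process), priority 19 (nicest)
eval { setpriority(0, 0, 19); 1 }
    or warn "setpriority not available here: $@";
print "now running at priority ", getpriority(0, 0), "\n";
```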
I have written a backup program to tar and gzip my entire directory
tree on their site, and also to dump the db and add that to the tar.
The program I have written pegs one of my cores at 100% for two
minutes and uses almost 100MB of RAM. If there is a way, I'd like to
reduce this load (as I can't 'nice' it).
I haven't tried running the program there yet; I don't want to get a
bad name for maxing out the hardware. I've used core modules only,
and I've used them as per documentation for the versions that were
part of 5.8.8. I've pasted the code below, I'd be grateful for
suggestions on how I could do the same while putting as little
load on the server as possible.
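(A note on where the memory goes: Archive::Tar builds the entire
archive in RAM before writing it out, which likely accounts for most
of the ~100MB. If the host has an external 'tar' available, worth
checking given that other tools are missing, shelling out to it
streams the data instead of buffering it. A rough sketch, assuming
GNU-style '-z' and '-C' options; the temp-directory demo merely
stands in for the real site paths:)

```perl
#!/usr/bin/perl
# Sketch: stream the tree through external tar+gzip so memory use
# stays small regardless of the size of the directory tree.
# Assumes the host's tar supports -z (gzip) and -C (change directory).
use warnings;
use strict;
use File::Temp qw(tempdir);

sub stream_backup {
    my ($parent, $subdir, $out) = @_;
    # list-form system() avoids a shell; tar reads files one at a time
    system('tar', '-czf', $out, '-C', $parent, $subdir) == 0
        or die "tar failed: $?";
}

# demo on a throwaway directory standing in for /var/sites/s/site.com
my $dir = tempdir(CLEANUP => 1);
mkdir "$dir/public_html" or die $!;
open my $fh, '>', "$dir/public_html/index.html" or die $!;
print $fh "hello\n";
close $fh;

stream_backup($dir, 'public_html', "$dir/backup.tar.gz");
print -s "$dir/backup.tar.gz" ? "archive written\n" : "no archive\n";
```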
~ $ cat bin/wp-backup.pl
#!/usr/bin/perl
use warnings;
use strict;
use Archive::Tar;
use File::Find;
# global vars
# POSIX::strftime (core) avoids forking a shell for `date`
use POSIX qw(strftime);
my $now = strftime('%Y-%m-%d-%H%M', localtime);
my $tar;
my $backup_dir = '/var/sites/s/site.com/backups';
my $archive = "$backup_dir/site.com.$now.tar.gz";
create_archive();
my $db_file = extract_db_data();
$tar->add_files($db_file);
$tar->write($archive, 9);    # 9 = maximum gzip compression
sub archive_it {
    # store entries under public_html/... rather than the full
    # filesystem path; $_ alone is only the basename, so build the new
    # name from the full path to keep subdirectories intact
    (my $new_name = $File::Find::name) =~ s{^.*?public_html}{public_html};
    (my $old_name = $File::Find::name) =~ s/^\///;
    $tar->add_files($File::Find::name);
    $tar->rename($old_name, $new_name);
}
sub create_archive {
    my $www_dir = '/var/sites/s/site.com/public_html';
    $tar = Archive::Tar->new;    # declared in globals
    find(\&archive_it, $www_dir);    # archive_it adds each file to the tar
    # no write() here; the archive is written once, after the db dump
    # has been added, to avoid compressing everything twice
}
sub extract_db_data {
    my $db = {
        user => 'name',
        pass => 'password',
        name => 'db',
        file => "site.com.$now.sql",
        host => '1.0.0.0',
    };
    # list-form system() bypasses the shell, so a '>' redirection would
    # be passed to mysqldump as a literal argument; use --result-file
    my @args = ('mysqldump', '--add-drop-table', '--complete-insert',
        '--extended-insert', '--hex-blob', "--host=$db->{host}",
        "--user=$db->{user}", "--password=$db->{pass}",
        "--result-file=$backup_dir/$db->{file}", $db->{name});
    system(@args) == 0 or die "problem running mysqldump: $?";
    return "$backup_dir/$db->{file}";
}
__END__
Thank you for any help or suggestions.
Justin.