Backup--quick, cheap, good/fast, pick all 3
Doug Sweetser
sweetser at TheWorld.com
Fri Sep 5 16:09:44 EDT 2003
Hello:
If one has music or video files to back up, that will require tapes
or other hard drives. One might decide to have one backup system for
REALLY BIG THINGS and another for all the text and images, which
together come in under 600MB and so can be burned to a single CD. If
your machine doesn't have a CD burner, they go for about $30-50. If
you need something in the 4GB range, a DVD burner goes for about
$150, and less as time goes on.
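A quick way to check whether the small stuff really fits on one CD is
du; the directories here are just the ones my backup script (below)
covers by default:

  du -shc /home /etc /var/lib/tripwire /var/www

The total on the last line tells you whether a 650/700MB disc is enough.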
Perl was designed to make chores easy. I wrote a simple script called
"backup". It calls another simple program to toss away things like
emacs backup files, and another little program on my Debian system
(dpkg --get-selections) to record every package I have installed. The
backup script then makes an iso image and burns it to CD. (If someone
has an idea for another "good practice" to add, please tell me.)
Although I am a bit sloppy, I like my chore programs to follow best
practices. Backups can be made easy, honest.
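The find_junk_rm helper itself isn't included below, only the main
backup script. A minimal sketch of that helper, assuming all it needs
to do is delete emacs backup (*~) and autosave (#*#) files under
/home, might look like this:

#!/usr/bin/perl -w
# find_junk_rm (sketch): remove emacs backup (*~) and autosave (#*#) files.
# The real helper may do more; this sketch covers only the files named above.
use strict;
use File::Find;
find(sub {
    return unless -f;                       # plain files only
    return unless (/~$/ or /^#.*#$/);       # emacs backup or autosave names
    print "removing $File::Find::name\n";
    unlink or warn "could not remove $File::Find::name: $!\n";
}, "/home");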
doug
#!/usr/bin/perl -w
$|++;
## Name: backup
## Author: Douglas Sweetser
## sweetser at alum.mit.edu
## License: GPL, info at end
### Program description
my $help_string = <<HELP;
Does all backup chores.
Usage: backup -dir name
backs up: /home /etc /var/lib/tripwire /var/www by default.
HELP
### Algorithm
# Run a few commands.
### Modules
use strict;
use English;
use Getopt::Long;
$Getopt::Long::autoabbrev = 1;
$Getopt::Long::ignorecase = 0;
### Variables
my $run;
my @dirs;
my @backup_dirs = qw (/home /etc /var/lib/tripwire /var/www);
my $today;
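# /usr/local/bin/yyyymmdd is presumably a small local helper that just
# prints the date as YYYYMMDD. An equivalent, using the core POSIX module:
#   use POSIX qw(strftime);
#   $today = strftime("%Y%m%d", localtime);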
$today = `/usr/local/bin/yyyymmdd`;
chomp $today;
my $help;
my $QA = 0;
### Main
__get_data();
# Eliminate junk.
print STDOUT "run find_junk_rm? [y] ";
$run = <STDIN>;
system ("/usr/local/bin/find_junk_rm") unless ($run =~ /^n/i);
# Record selected packages.
unless (-e "/home/sweetser/Mail/OS/Linux/selections.$today")
{
    print STDOUT "Recording selected packages.\n";
    system ("/usr/bin/dpkg --get-selections > /home/sweetser/Mail/OS/Linux/selections.$today");
}
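# To restore from one of these selections files on a fresh Debian system:
#   dpkg --set-selections < selections.YYYYMMDD
#   apt-get dselect-upgrade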
# Make iso image for cd.
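# mkisofs -r adds Rock Ridge extensions with ownership and permissions
# rationalized for reading back from CD; -o names the output image.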
print STDOUT "run mkisofs? [y] ";
$run = <STDIN>;
system ("/usr/bin/mkisofs -r -o $today.iso @backup_dirs") unless ($run =~ /^n/i);
# Burn the cd.
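# dev=0,0,0 names the burner as SCSI bus,target,lun and is site-specific;
# `cdrecord -scanbus` lists the devices it can see. -multi leaves the disc
# open so later sessions can be appended.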
print STDOUT "run cdrecord? [y] ";
$run = <STDIN>;
system ("/usr/bin/nice --18 /usr/bin/cdrecord -eject -v speed=2 dev=0,0,0 -multi -data -pad $today.iso") unless ($run =~ /^n/i);
### Signals
exit(0);
### Subroutines
# Get data, assign to variables.
sub __get_data {
    my $get = GetOptions("dirs=s" => \@dirs, "help" => \$help, "QA" => \$QA);
    die ("Check options please.\nProgram exiting.\n") unless $get;
    if ($help) {
        print $help_string;
        exit(1);
    }
    ## Check if user is root (effective UID 0; $ENV{USER} is not always set).
    die ("Must be root.\nProgram exiting.\n") unless ($EUID == 0);
    ## Add to list of directories.
    push @backup_dirs, @dirs if (defined ($dirs[0]));
    print STDERR "Will be backing up: @backup_dirs.\n";
}
### Perldoc

=pod

=head1 NAME

backup - Automates data backup

=head1 DESCRIPTION

Runs the backup chores: cleans out junk files, records the installed
Debian packages, builds an ISO image, and burns it to CD.

=head1 USAGE

Usage: backup -dir name

Backs up /home /etc /var/lib/tripwire /var/www by default.

=head1 OPTIONS

=over 4

=item I<-dir name>

Adds a directory to be backed up. Modify this perl script if a
directory should regularly be backed up.

=item I<-help>

Prints this help message.

=back

=head1 AUTHOR

Doug Sweetser at alum.mit.edu 2003. All rights reserved.

=cut

### License
# # This program is free software; you can redistribute it and/or modify
# # it under the terms of the GNU General Public License as published by
# # the Free Software Foundation, version 2 of the License.
# # This program is distributed in the hope that it will be useful,
# # but WITHOUT ANY WARRANTY; without even the implied warranty of
# # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# # GNU General Public License for more details.
# # You should have received a copy of the GNU General Public License
# # along with this program; if not, write to the Free Software
# # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
[Note: the reason for these odd #'s is that grep '##' backup shows the
structure and comments in the program.]
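A typical run, assuming the yyyymmdd and find_junk_rm helpers are
installed in /usr/local/bin and the burner really is at dev=0,0,0,
looks like this (run as root from a directory with room for the image;
/srv/photos is just an example of an extra directory):

  backup -dirs /srv/photos
  Will be backing up: /home /etc /var/lib/tripwire /var/www /srv/photos.
  run find_junk_rm? [y]
  Recording selected packages.
  run mkisofs? [y]
  run cdrecord? [y]

The "Recording selected packages." line only appears the first time the
script is run on a given day.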