[svlug] What's A Good Program To Backup Hard Disk To CD-Rs?
Mark S Bilk
mark at cosmicpenguin.com
Thu Jan 1 18:32:46 PST 2004
I've been doing backups with BRU to 5GB Exabyte 8mm tapes.
It's been pointed out to me that BRU might stop working at
some point due to kernel changes, and that its backups, being
in a secret proprietary format, would then become unreadable.
Also, the version I have may not even be able to write to CD-Rs.
So, please pardon my naivete, what is a good open-source program
for doing this (I'll graduate to DVD-Rs eventually)? I don't
mind sitting there and putting in a new CD-R blank every two
minutes or so (for now). At least they are 1/5 the price and
ten times the speed of 8mm tapes.
And are those spindles of 50 GQ-brand 52X CD-Rs that Fry's
sells on bargain days for $7 reliable?
38GB of my files are non-compressible, and 4GB are partly
compressible (text and binaries), so compression would only
save me a couple of CD-Rs and would compromise reliability
by allowing a small error to possibly kill a whole big text
file. So I don't want or need compression.
Basically I just want to back up a selected directory tree
onto as many CD-Rs as it takes. For incremental backups,
it needs to select only those files modified after a specified date.
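For that date selection, I imagine GNU find's -newer test against a
timestamp file touched at the last backup would do it. A throwaway
sketch (temp files standing in for the real tree and stamp):

```shell
set -e
work=$(mktemp -d)
mkdir -p "$work/data"
echo old > "$work/data/old.txt"
touch -d "2 days ago" "$work/data/old.txt"   # predates the last backup
touch -d "1 day ago" "$work/stamp"           # written at the last backup
echo new > "$work/data/new.txt"              # changed since then
# Only files newer than the stamp are candidates for the incremental.
find "$work/data" -type f -newer "$work/stamp" > "$work/incr-list.txt"
cat "$work/incr-list.txt"
```

The real version would touch the stamp file right after each successful
backup run.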
If you do a total backup, then move a bunch of files, and then
do an incremental backup, are there programs that when doing
a restore will only put the files into their new location
instead of duplicating them into both old and new? This would
require taking a snapshot of the entire directory structure
(file names, sizes, locations, etc.) at every incremental
backup, and then using the latest such snapshot for restoring
the total backup and all the incrementals.
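The snapshot itself could be nothing fancier than a sorted find listing
with sizes, taken at each incremental; the restore would then consult
the newest one to decide final locations. A sketch, with a temp tree and
GNU find's -printf assumed:

```shell
set -e
work=$(mktemp -d)
mkdir -p "$work/data/docs"
printf 'hello' > "$work/data/docs/a.txt"
# One line per file -- relative path, then size in bytes.
find "$work/data" -type f -printf '%P\t%s\n' | sort > "$work/snapshot.txt"
cat "$work/snapshot.txt"
```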
I guess it would be good to have the files in the backups be in
a flat structure -- all in one directory instead of duplicating
the directory structure of the source disk -- and storing the path
of each file in a preamble to it. Like tar does (and DOS backup).
So an error in a backed-up directory couldn't mess up all the
files underneath it.
Tar doesn't do any compression, right?
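(As far as I know that's right: tar on its own only archives, and
compression only happens if you bolt gzip on with -z or a pipe. A quick
check with a throwaway file:)

```shell
set -e
work=$(mktemp -d)
printf 'plain text payload\n' > "$work/file.txt"
tar -C "$work" -cf "$work/plain.tar" file.txt        # archive only
tar -C "$work" -czf "$work/packed.tar.gz" file.txt   # gzip added via -z
# The uncompressed archive stores the member byte for byte.
tar -tf "$work/plain.tar"
```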
I guess I could use find to scan the disk and make a list of
the files and their sizes, and feed that to a C or bash program
that would divide the list into CD-R sized sections and
individually tar the files in each one and make them into an
ISO image on a gig (or whatever) of free disk space, and then
burn that onto a CD-R, and verify it against the actual files.
The sectioned list would be saved for restarting an interrupted
backup, and for locating the CD-R containing a single file to
be restored. Wowie, a simple sort of find's output alphabetizes
the list at every level, making it much easier to locate a file!
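In sketch form, that pipeline might look like the following, with a tiny
demo tree and a 1MB limit standing in for the real source disk and the
~650MB CD-R capacity (GNU find and awk assumed; the mkisofs/cdrecord
step is left commented so nothing burns by accident):

```shell
set -e
work=$(mktemp -d)
mkdir -p "$work/data"
cd "$work"
# Demo files standing in for the real source tree (600KB each).
dd if=/dev/zero of=data/a.bin bs=1024 count=600 2>/dev/null
dd if=/dev/zero of=data/b.bin bs=1024 count=600 2>/dev/null
dd if=/dev/zero of=data/c.bin bs=1024 count=600 2>/dev/null
LIMIT=$((1024 * 1024))   # 1MB for the demo; ~650MB for real CD-Rs

# 1. List every file with its size; sorting by path alphabetizes
#    the list at every level, as noted above.
find data -type f -printf '%s\t%p\n' | sort -k2 > files.txt

# 2. Divide the list into sections that each fit within LIMIT bytes.
awk -F'\t' -v limit="$LIMIT" '
    BEGIN { n = 0; used = 0 }
    { if (used + $1 > limit && used > 0) { n++; used = 0 }
      used += $1
      print $2 > ("section-" n ".txt") }
' files.txt

# 3. Tar each section; real use would follow with mkisofs + cdrecord.
for list in section-*.txt; do
    tar -cf "${list%.txt}.tar" -T "$list"
    # mkisofs -o "${list%.txt}.iso" "${list%.txt}.tar"
    # cdrecord dev=0,0,0 speed=52 "${list%.txt}.iso"
done
ls section-*.tar
```

The saved section lists double as the restart point for an interrupted
run and as the index for finding which CD-R holds a given file.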
OK, I may have induced too much groaning among you experts for
the new year already, so I will stop now. Any suggestions?