backup2l on Mac OS X

Some months ago, I worked out a satisfactory way of backing up my iBook’s hard disk. The iBook runs Mac OS X “Panther” (currently version 10.3.8), and since Mac OS X is a UNIX derivative, I thought I’d try backup2l (now at version 1.4). I had read about the tool in Oli’s blog, and within about a day I had it working to my satisfaction.

In theory, I could have used backup2l more or less out of the box. All the UNIX utilities it needs, such as tar, gzip, and sed, are installed with the BSD subsystem on Mac OS X. However, there were two major problems:

  1. When running backup2l with my first test configuration file, there were error messages from utilities like du, complaining about command line options they didn’t understand.
  2. The installed versions of tar and friends cannot safely back up files from the HFS file system used in standard Mac OS installations because they don’t handle resource forks (e.g. a thumbnail for an image file might be stored in the resource fork).

The first problem occurred because backup2l makes use of features that only appear in the GNU versions of the required utilities. The installed versions are not sufficient—this is now mentioned in the README file of the backup2l distribution. Since I make use of the Fink project, I solved the problem by installing the packages sed, findutils and fileutils. It may also be necessary to install the diffutils package, which I already had installed.
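If you want to check which versions your system picks up before running backup2l, a quick sketch like the following reports whether the first-found utilities are the GNU ones (it assumes Fink’s /sw/bin is already on your PATH; the BSD versions simply won’t identify themselves as GNU):

```shell
#!/bin/sh
# Report whether the utilities backup2l relies on, as found first in PATH,
# are the GNU versions. BSD variants either don't support --version or
# don't print "GNU", so they fall into the second branch.
for tool in sed find du diff; do
    if "$tool" --version 2>/dev/null | head -n 1 | grep -q GNU; then
        echo "$tool: GNU version found"
    else
        echo "$tool: not the GNU version"
    fi
done
```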

The second problem meant that I had to find a suitable command line archiver that could handle HFS, and that I’d have to write my own “driver” for backup2l. I found four candidate archivers:

  1. ditto, which is written by Apple and included in Mac OS X. I’d like to have used it, because the manufacturer would probably know best how to handle HFS correctly. However, although it has a mode that allows you to create archives in cpio format, you cannot pass it a list of files to archive in that mode, which is a prerequisite for using it with backup2l. This is listed as a known bug in the ditto man page, so it may be fixed in the future.
  2. hfstar is a patched version of GNU tar that can handle HFS. I tried this first, because it meant I only had to make a slightly modified version of the standard tar driver in the backup2l script. Everything worked fine until the point where backup2l compares the list of files that should have been archived with the list of files actually in the archive. I saw at that point that hfstar had cut off some long pathnames, which of course made it unusable for my purpose. This may be a limitation of GNU tar—I didn’t investigate further and gave up trying to make this work.
  3. HELIOS Xtar is another version of tar that can handle HFS. After my experience with hfstar, I didn’t even try this, because I assumed it would have the same problem. I may be wrong, however.
  4. hfspax is a hacked, HFS-enabled version of pax, an archiver that also comes with Mac OS X. This is what I ended up using.

I then wrote the driver and added it to the appropriate configuration variables:



    case $1 in
        -test)
            require_tools hfspax
            echo "ok"
            ;;
        -suffix)
            echo "pax"
            ;;
        -create)
            # $3 = archive file name, $4 = list of files to back up
            hfspax -w -d -f $3 -x cpio < $4 2>&1
            ;;
        -toc)
            # $3 = archive file name; filter out the resource-fork entries
            hfspax -f $3 | grep -v "\.\.namedfork"
            ;;
        -extract)
            # $3 = archive file name, $4 = list of files to restore
            sed 's#^#/#' $4 | xargs -i -r hfspax -r -pe -f $3 -s#^/## "{}" 2>&1
            ;;
    esac

Don’t ask about what this all means—I’ve half forgotten it myself by now. It works for me … Also, please note that this doesn’t use compression.
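For reference, hooking the driver into backup2l might look like the following sketch. It assumes the case statement above is wrapped in a shell function named DRIVER_HFSPAX (the name is my choice); since backup2l.conf is sourced as a shell script, the function can be defined right in the configuration file:

```shell
# Hedged sketch of the relevant part of backup2l.conf. DRIVER_HFSPAX is an
# assumed name for a shell function containing the case statement shown
# above; define it in this file, then select it instead of the default
# tar driver:
CREATE_DRIVER=DRIVER_HFSPAX   # the default would be DRIVER_TAR_GZ
```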

So, to summarize: if you want to use this, install backup2l and write an appropriate backup2l.conf that uses the above driver. I’ve been backing up my hard disk (to my iPod, and before that to a Linux partition mounted via NFS) for some months now without problems, but I can’t guarantee that it will work for you. So handle with care, and make sure that restoring works as well.

And one quirk: the date output in the backup summary only works if you also install a GNU version of date. This isn’t provided in the Fink stable distribution, so I didn’t bother.
