# file backup - currently using pax



## soncorn (Aug 26, 2011)

I have an apache web server on FreeBSD that I use to host a Gallery 2 image gallery.  This is used to share photos with friends and family.

I have a lot of pictures that are stored on this computer.

I am not interested in backing up the whole system.  It is relatively trivial for me to do a clean install of a new system to support this "hosting".

I currently backup the images using pax.

The files are backed up to external USB hard disk drives that are also used to backup/synchronize MS Windows files via Allway Sync.  The file system on this drive is FAT32 for compatibility purposes across multiple platforms.

When I am archiving the images I am limited to archive file sizes of <2GB due to FAT32.

I am open to changing file system format as long as I can continue to use the drive for FreeBSD, OpenBSD and Windows 7.  The last time I checked NTFS support was still poor on the BSDs and Unix-like operating systems.

So I currently backup the files using pax to an archive spanning multiple <2GB files.

The command I use is:

```
pax -w -v -B 2147483648 -f archive01.pax /usr/xxx/yyy/zzz/images
```


I cannot run this and forget it; I have to interact with it every time an archive reaches the ~2GB limit and enter the name of the next archive: archive02.pax, archive03.pax, archive04.pax, ...

Currently I am up to 27 ~2GB files or around 51GB of files.

Is there a way to automate this process of incrementing the file names?

Maybe there is a switch I am missing in pax, or perhaps someone could point me to some web pages that would help me write a bash (or other) script to do this?

This is nothing more critical than trying to back up some important digital photos, but I can see how this might be handy for others who want to back up large amounts of files onto less capable file systems.

Thank you in advance for any guidance.


----------



## wblock@ (Aug 27, 2011)

The general approach is to not write directly to a file, but pipe the archiver output to gzip(1) and then split(1) it into multiple files of 2000M or less.  Clonezilla does that, for instance.  On restore, cat(1) the split files together and gunzip(1) them into the archiver program.


----------

