# backup a big partition using cpio



## ccc (Aug 11, 2010)

hi

How do I back up a 40GB /usr partition on an old FreeBSD 4.6 system using *cpio*?
cpio cannot deal with files > 2GB.
Is there any way to split the output into many backup files?


----------



## jem (Aug 11, 2010)

dump(8) to gzip to a file
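A sketch of that approach (the output path is an assumption; note that snapshot-related flags such as `-L` are newer and may not exist on FreeBSD 4.x):

```shell
# Level-0 (full) dump of the /usr filesystem, compressed and written
# to a single file on another mounted filesystem.
#   -0 = full dump
#   -a = auto-size, bypass tape length calculations
#   -u = record the dump in /etc/dumpdates
#   -f - = write the dump to standard output
dump -0auf - /usr | gzip -c > /mnt/backup_usr.dump.gz
```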


----------



## ccc (Aug 11, 2010)

Thanks, but I'm wondering, is there any way to do it using cpio?


----------



## graudeejs (Aug 11, 2010)

Keep it simple: use tar, dump/restore, or cp.

[You can split/join files with dd, but you can do lots of damage as well, if you aren't careful. _But I didn't tell you that_]
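To illustrate that aside: splitting a file into fixed-size pieces with dd and joining them again (file names here are made up, and this is exactly where a wrong `of=` or `skip=` can destroy data):

```shell
# Split backup.img into 1 MB pieces by reading successive 1 MB windows.
# bs=1048576 is 1 MB spelled out for portability (BSD dd also accepts bs=1m).
# skip= selects which 1 MB block of the input each piece starts at.
dd if=backup.img of=piece.0 bs=1048576 count=1 skip=0
dd if=backup.img of=piece.1 bs=1048576 count=1 skip=1

# Join the pieces back together, in order.
cat piece.0 piece.1 > restored.img
```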


----------



## Beastie (Aug 11, 2010)

ccc said:

> Is there any way to split the output into many backup files?


Pipe the output to split(1)'s standard input, i.e. pass a dash ("-") as its input file argument.
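A minimal sketch of that (the input file and the `chunk_` prefix are made up):

```shell
# split reads from stdin ("-") and writes 1 GB pieces
# named chunk_aa, chunk_ab, chunk_ac, ...
gzip -c somefile | split -b 1g - chunk_
```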


----------



## ccc (Aug 11, 2010)

Thanks, but is this command correct to back up the whole /usr partition and split it into 1GB files?


```
# cd / ; find ./usr -depth -print -xdev | cpio -ovfB | gzip -c | split -b 1000m > /mnt/backup_usr.gz &
```


----------



## graudeejs (Aug 11, 2010)

NO, use dump/restore

http://forums.freebsd.org/showthread.php?t=185


----------



## Beastie (Aug 11, 2010)

ccc said:

> Thanks, but is this command correct to back up the whole /usr partition and split it into 1GB files?
> 
> 
> 
> ...


Do not use the final *> /mnt/backup_usr.gz*; use *- /mnt/backup_usr.gz_* instead, so split(1) reads from standard input and writes pieces with that prefix. And split(1) understands gigabyte suffixes, so *-b 1g* should work. The trailing underscore is only there for clarity, separating the name from the suffixes split appends.

`# cd / ; find ./usr -depth -print -xdev | cpio -ovfB | gzip -c | split -b 1g - /mnt/backup_usr.gz_`


But as already mentioned, dump/restore may be easier (and faster).


----------



## ccc (Aug 12, 2010)

Thanks a lot, it works well now!


----------

