# Script for backup of a second disk



## adripillo (May 20, 2013)

Hello, I am looking for a script to make a backup of a second disk that is mounted on my system to a single file on my main disk. I found this from @SirDice:



> Script? How about a oneliner?
> 
> `# tar -C /some/dir -zcvf backup.tgz files`
> 
> This will create a gzipped tar file called backup.tgz of the directory /some/dir/files/.



This worked very well. My question is: what can I do to make it incremental, and what can I do to make it differential? Thanks again.


----------



## NewGuy (May 20, 2013)

Well, I suppose if you wanted to back up only files which had been altered (or created) in the past week, you could do something like `tar -C /some/dir -zcvf backup.tgz $(find /path/to/files -mtime -7 -print)`. The find command can locate files modified or created in the past N days; in the example above, N is 7 days (one week). find then provides a file list to the tar command. You will probably want to adjust the find portion of the command to make sure you are getting relative path names rather than the full path names of your files.
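Command substitution like this word-splits the file list onto tar's command line, which breaks on long lists and on filenames with spaces. A safer variant of the same idea (all paths below are throwaway examples, and it assumes filenames without embedded newlines) writes the list to a file and feeds it to tar with `-T`; the demo builds its own little tree so it can be run as-is:

```shell
#!/bin/sh
# Demo: incremental-style tar, selecting only recently changed files.
set -e
DEMO=$(mktemp -d)
mkdir -p "$DEMO/data"
echo new > "$DEMO/data/new.txt"
echo old > "$DEMO/data/old.txt"
touch -t 201301010000 "$DEMO/data/old.txt"   # pretend this file is old

# run find from inside the tree so the archived paths stay relative
cd "$DEMO/data"
find . -type f -mtime -7 -print > "$DEMO/files.list"
tar -zcf "$DEMO/backup.tgz" -T "$DEMO/files.list"

tar -ztf "$DEMO/backup.tgz"   # lists only ./new.txt
```

Both BSD tar and GNU tar accept `-T list-file`, so the same sketch works on FreeBSD and Linux.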


----------



## adripillo (May 21, 2013)

NewGuy said:

> Well, I suppose if you wanted to back up only files which had been altered (or created) in the past week, you could do something like `tar -C /some/dir -zcvf backup.tgz $(find /path/to/files -mtime -7 -print)`. The find command can locate files modified or created in the past N days; in the example above, N is 7 days (one week). find then provides a file list to the tar command. You will probably want to adjust the find portion of the command to make sure you are getting relative path names rather than the full path names of your files.



Let me see if I get it. You are telling me that with this command, instead of backing everything up again, it will find only the files that were added and then add them to the backup file? Is there some way to make, for example, a new file each week? Like back up every Friday and create a new file for each Friday: backup1, backup2, backup3, and so on?
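For the one-file-per-week part, the usual approach is a date-stamped archive name run from cron. A minimal sketch (the directory layout here is a temporary stand-in for your real paths):

```shell
#!/bin/sh
# Weekly full backup with a date-stamped name, e.g. backup-2013-05-24.tgz,
# so each run creates a new file instead of overwriting the last one.
set -e
SRC=$(mktemp -d)              # stand-in for /some/dir; use your real path
mkdir "$SRC/files"
echo hello > "$SRC/files/a.txt"

OUT="backup-$(date +%Y-%m-%d).tgz"
tar -C "$SRC" -zcf "$SRC/$OUT" files
ls "$SRC"
```

Run the real script from cron, e.g. `0 3 * * 5 /path/to/weekly-backup.sh` fires every Friday at 03:00 (the schedule is just an example).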


----------



## kpa (May 21, 2013)

Take a look at the old but still viable tools dump(8) and restore(8). They have full support for incremental backups. 

Also @wblock@ has written a guide for using them:

http://www.wonkity.com/~wblock/docs/html/backup.html
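To give a flavor of how the levels work, a minimal dump/restore cycle might look like the fragment below. The output paths are placeholders; dump needs root and a UFS filesystem, and `-u` records each run in /etc/dumpdates so later incremental levels know what changed since the last lower-level dump.

```shell
# level 0 = full backup; -a autosize, -u record in /etc/dumpdates, -f output file
dump -0 -a -u -f /usr/home/USER/bkup/datos.full.dump /media/datos

# later: level 1 = everything changed since the last lower-level (0) dump
dump -1 -a -u -f /usr/home/USER/bkup/datos.incr.dump /media/datos

# restore interactively from the full dump
restore -i -f /usr/home/USER/bkup/datos.full.dump
```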


----------



## adripillo (May 21, 2013)

kpa said:

> Take a look at the old but still viable tools dump(8) and restore(8). They have full support for incremental backups.
> 
> Also @wblock@ has written a guide for using them:
> 
> http://www.wonkity.com/~wblock/docs/html/backup.html



Thanks a lot, that seems to be what I need. I will take a deep look at it.


----------



## adripillo (May 21, 2013)

After reading the options, I feel that dump is not what I am looking for, so I found rsync. I copied an example and modified it for myself. I leave the lines here because I want to know if what I am doing is right. Let me explain more about what I need to do so everyone can understand me better. I have a UFS disk mounted on /media/datos, and I have a folder on my main HDD where I want to save the incremental backup, /usr/home/USER/bkup. So now the code is this (please correct me if something is wrong):


```
#!/bin/sh

# directory to back up (the mounted second disk)
BDIR=/media/datos

# local directory that receives the backups
DEST=/usr/home/USER/bkup

# file with one exclude pattern per line (create it, even if empty)
EXCLUDES=/usr/home/USER/bkup.excludes

########################################################################

# incremental directory named after the current day of the week
BACKUPDIR=`date +%A`
OPTS="--force --ignore-errors --delete-excluded --exclude-from=$EXCLUDES \
      --delete --backup --backup-dir=$DEST/$BACKUPDIR -a"

# clear last week's incremental directory for this weekday
rm -rf "$DEST/$BACKUPDIR"
mkdir -p "$DEST/$BACKUPDIR" "$DEST/current"

# the actual transfer: keep a mirror in $DEST/current and move files
# that changed or were deleted into $DEST/$BACKUPDIR
rsync $OPTS "$BDIR/" "$DEST/current/"
```


----------



## adripillo (May 21, 2013)

This can be closed. I will open a new post about rsync so all can see it. Thanks.


----------

