# tar zcf has problems with very large files



## bsus (Jul 20, 2011)

Hi,
I have written a backup script which creates a tar.gz of a directory.
The directory contains files of various types, mostly around 10MB each; altogether we have about 60GiB of data.

Now tar should bundle this together and then compress it with gzip. The archive is named after the date and is kept for about 30 days in a separate backup directory.

So much for the theory. The script works properly on GNU/Linux, but under FreeBSD there are many issues.

I have now found that the problem is a tar issue; the rest of the script works fine, and the script is even listed in the cron logs.


So far the script has only worked properly once, and since then I haven't been able to find any new backups.
When I run the script manually I can see tar working, but at the end something makes tar fail every time.
I know this because the script ends with:

```
tar zcvf ${BACKUPDIR}/${DATUM}.tar.gz ${SOURCE} && halt -p
```
And the server is always still on when I come back...

Sorry for the awful English. Regards
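For reference, here is a minimal, self-contained sketch of what such a script's core could look like. BACKUPDIR, DATUM, and SOURCE mirror the script's variable names, but the paths and the demo file are made up:

```shell
#!/bin/sh
# Hypothetical reconstruction -- the real script's paths differ.
BACKUPDIR=/tmp/backup        # where finished archives are parked
DATUM=$(date +%d-%m-%Y)      # the archive is named after the date
SOURCE=/tmp/data             # the directory being backed up

# Demo data so the sketch is self-contained.
mkdir -p "$BACKUPDIR" "$SOURCE"
echo demo > "$SOURCE/file.txt"

# Create the gzip-compressed archive. In the real script this line
# ends with '&& halt -p', so the machine powers off only on success.
tar zcvf "${BACKUPDIR}/${DATUM}.tar.gz" "$SOURCE"
```

The `&&` means `halt -p` runs only when tar exits successfully, so the server still being powered on in the morning is itself the failure report.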


----------



## SirDice (Jul 20, 2011)

Please post the full error you're getting.


----------



## bsus (Jul 20, 2011)

That's the problem:
I can't post an error directly; I can only say what doesn't work.
The server has to be available from 8:00 to 24:00, so I start the backup via SSH overnight. When I come back to the SSH terminal in the morning, it has lost the connection after a few hours.
Because I lose the SSH connection, I am never able to see what's going on.

Regards


----------



## SirDice (Jul 20, 2011)

Redirect the output to a file so you can read it later on.
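For example, a sketch with made-up paths (note that GNU tar prints the `-v` file list on stdout, while FreeBSD's bsdtar prints it on stderr):

```shell
# Demo data (made-up paths).
mkdir -p /tmp/redir_src /tmp/redir_out
echo demo > /tmp/redir_src/a.txt

# Send tar's verbose file list into a log file instead of the
# (possibly disconnected) terminal.
tar czvf /tmp/redir_out/demo.tar.gz -C /tmp redir_src > /tmp/redir_out/output

# Read the log later, after the backup has finished.
cat /tmp/redir_out/output
```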


----------



## bsus (Jul 20, 2011)

Good idea.

Can I do this with
```
tar czf /media/backup/${DATE} /media/data > output
```


----------



## SirDice (Jul 20, 2011)

Yes, but you might want to redirect STDERR too, not just STDOUT.


----------



## bsus (Jul 20, 2011)

> Yes, but you might want to redirect STDERR too, not just STDOUT.


What do you mean in detail?

Regards


----------



## SirDice (Jul 20, 2011)

This stuff should be second nature...

http://en.wikipedia.org/wiki/Redire...recting_to_and_from_the_standard_file_handles
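The short version, as a runnable demo (the file names are arbitrary):

```shell
# stdout (descriptor 1) and stderr (descriptor 2) are separate streams
# and can be redirected independently of each other.
( echo "normal output"; echo "an error" >&2 ) > /tmp/out.log 2> /tmp/err.log

cat /tmp/out.log    # holds the normal output
cat /tmp/err.log    # holds the error line

# '2>&1' points stderr at wherever stdout currently goes,
# so both streams land in one file.
( echo "both"; echo "streams" >&2 ) > /tmp/all.log 2>&1
```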


----------



## wblock@ (Jul 20, 2011)

There's also script(1), which records a whole terminal session to a file. But I'd also suggest that, as good practice, you put the tar command option first and the file option last: tar cvzf ....
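A small demonstration of that ordering, with made-up paths: since `f` takes the archive name as its argument, keeping `f` last means the very next word on the line is the archive name:

```shell
# Demo data (made-up paths).
mkdir -p /tmp/order_demo
echo x > /tmp/order_demo/a.txt

# 'f' is the last bundled option, so the next argument is unambiguously
# the archive name.
tar cvzf /tmp/order_demo.tar.gz -C /tmp order_demo

# List the archive to confirm what went in.
tar tzf /tmp/order_demo.tar.gz
```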


----------



## bsus (Jul 20, 2011)

@SirDice
Ah, OK, this was new to me with regard to (ba)sh.
So if I have understood this correctly, I should use

```
tar cvzf ${BACKUPDIR} ${SOURCE} 2>&1 output
```

@wblock
ok, changed

Regards


----------



## SirDice (Jul 20, 2011)

bsus said:

> So when I have understood this right I should use
> 
> ```
> tar cvzf ${BACKUPDIR} ${SOURCE} 2>&1 output
> ```


This will produce an error.

```
tar -zcvf ${BACKUPDIR} ${SOURCE} > output 2>&1
```

Keep in mind this only works with (ba)sh. C-shells use a different way.
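A compact sketch of the difference; only the Bourne-style line is executed here, and the csh form is shown as a comment:

```shell
# Bourne shells (sh, bash): '>' redirects stdout to the file, then
# '2>&1' points stderr at the same place. The order matters: the
# file redirection must come first.
( echo ok; echo failed >&2 ) > /tmp/combined.log 2>&1

# csh/tcsh spell the same thing differently (shown only as a comment):
#   command >& /tmp/combined.log
cat /tmp/combined.log
</tmp/combined.log contains both lines.
```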


----------



## phoenix (Jul 20, 2011)

bsus said:

> This is the problem
> I can't post an error directly. I just can say what doesn't work.
> The server has to be available from 8 - 24h when I now start the backup via ssh over night, I come in the morning back to the ssh terminal which lost the connection after some hours..
> Because I lose the ssh connection I am never able to see whats going on.



You just described the main reason for the existence of terminal multiplexers like sysutils/screen and sysutils/tmux.

Here's what you do:

1. install sysutils/tmux on the remote server
2. connect to the remote server via ssh
3. start tmux(1)
4. start the backup process

Now, if you lose the SSH connection, it doesn't matter: the backup will still be running in the tmux session. Just reconnect to the server via SSH, and reattach to the tmux session:
`$ tmux attach`

It will be as if you never lost the connection. You can even scroll through the tmux screen buffer using *CTRL+B, [* (that's the left square bracket) and then the cursor keys or page up/down keys (hit ESC to break out of scroll mode).
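The steps above, as a session sketch (user and server names are placeholders):

```
$ ssh user@server     # 1. connect to the remote machine
$ tmux                # 2. start a tmux session
$ ./backup.sh         # 3. run the backup inside it
  ...SSH connection drops; the backup keeps running...
$ ssh user@server     # 4. reconnect whenever convenient
$ tmux attach         # 5. the session is exactly where you left it
```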


----------



## bsus (Jul 20, 2011)

@phoenix
thanks for the good tip

So, I ran the backup script through to the end.
First of all, the backup should take up about 30GB of space.

```
ls -l /media/backup   
total 30371384
-rw-r--r--  1 root  wheel   3095080960 Jul 14 23:43 14-07-2011.tar.gz
-rw-r--r--  1 root  wheel  27982143697 Jul 20 21:12 20-07-2011.tar.gz
```
Is this realistic?


The script itself ended with

```
tar: output: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors.
```


----------



## jalla (Jul 21, 2011)

bsus said:

> The script itself ended with
> 
> ```
> tar: output: Cannot stat: No such file or directory
> ...



You've missed the redirect (>).
tar tries to archive the file output, which doesn't exist.
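The mistake is easy to reproduce with made-up paths: without the `>`, the word output is parsed as one more path for tar to archive:

```shell
# Demo data (made-up paths).
mkdir -p /tmp/fix_src
echo data > /tmp/fix_src/f.txt

# Missing '>': 'output' becomes another path to archive, and tar
# complains that it cannot stat it (the archive is still written).
tar czf /tmp/broken.tar.gz -C /tmp fix_src output 2> /tmp/tar.err || true
cat /tmp/tar.err

# With the redirect in place, the verbose listing lands in the log.
tar czvf /tmp/fixed.tar.gz -C /tmp fix_src > /tmp/output.log 2>&1
```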


----------

