# A good little recursive FTP program for file 'backup'.



## PacketMan (Jan 3, 2016)

I have a headless NAS that holds all my files.  I use a set of (different) NAS drives I got from a typical big box store, which I rotate off-site.  FTP is really my only option; I cannot install any 'apps' on the NAS drive.

I am looking for a port to install on my headless NAS that:

- Can be configured via a 'text' configuration file. (Configuring via a terminal screen would also be fine, but I prefer a configuration file.)
- Launches via a simple command, or can be scheduled via a cron job.
- Will recursively walk a folder and FTP copy (put) any and all files and/or folders in that parent folder.
- Does not have to use any form of tarring. For now I want all files copied over individually, but a tar option could be useful someday.

Maybe a nice little logging option would be useful too.

Are there some 'good little' ports that anyone can recommend I try?


----------



## leebrown66 (Jan 3, 2016)

ftp/lftp or ftp/wput look promising.  Having said that, I've used neither.


----------



## PacketMan (Jan 3, 2016)

Thanks a bunch.  I'll probably give that a try soon, but in the meantime I was learning some CLI options for tar and ftp.  What is probably old school for most of you is still a learning experience for me.  But I got this format working:

`tar -cf file_name_goes_here.tar /nas_folder/pick_your_folder(s)/*`
`ftp -u ftp://username_goes_here:password_goes_here@IP_address_goes_here/Backups/TARs/ *`

So now I will tinker with cron, and will try to write my first script soon.  I don't have enough space on my NAS drive to tar everything at once, and I don't want to buy a second drive.  So I will have to do one tar and FTP transfer at a time, delete the local tar, and repeat.
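A one-folder-at-a-time loop along those lines could be sketched like this (untested; all paths and credentials are placeholders, and the actual upload is left commented out because it needs the real NAS):

```shell
#!/bin/sh
# Sketch: tar one subfolder at a time, upload it, delete it, repeat,
# so only one tar ever sits in the staging area at once.
backup_one_at_a_time() {
    src=$1    # parent folder whose subfolders get backed up
    work=$2   # staging area with room for just one tar
    mkdir -p "$work"
    for dir in "$src"/*/; do
        [ -d "$dir" ] || continue            # no subfolders matched
        name=$(basename "$dir")
        tar -cf "$work/$name.tar" -C "$src" "$name" || return 1
        # Upload, then delete to free space before the next tar.
        # (Commented out here; requires the real FTP server.)
        # ftp -u "ftp://user:pass@IP/Backups/TARs/" "$work/$name.tar" \
        #     && rm "$work/$name.tar"
    done
}
```

With the `ftp`/`rm` lines uncommented, disk usage never exceeds the size of the largest single subfolder's tar.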


----------



## shepper (Jan 4, 2016)

Another option: ftp/wget will perform recursive file transfers.  The mirror option, further down on the linked man page, might also be of use.


----------



## SirDice (Jan 4, 2016)

You can tar(1) directly to a remote host if the remote allows ssh(1) access (I suggest creating a specific backup account with keys). 

`tar -C /base/dir -cf - somefile | ssh backup@server1 tar -C /backup/storage/ -xvf -`

This will back up /base/dir/somefile to the remote directory /backup/storage/ on server1.

You can also _pull_ the data across ssh(1) (the example above _pushes_ it):
`ssh backup@myserver tar -C /base/dir -cf - somefile | tar -C /backup/storage/ -xvf -`


----------



## PacketMan (Oct 29, 2016)

Thanks again SirDice, this works really well.  Just out of curiosity, is it possible to use this format to tar all the files in a folder tree into a file.tar and copy that file.tar over to the target server via ssh?  I've tried all sorts of combinations but none of them work.  My target server is a SeaGate device, so that might be my limiting factor.

Example of one of many attempts:
`tar -cf files.tar -C /media - pictures | ssh packetman@server tar -C /backup/storage/ -vf -`


----------



## jtotheh (Oct 30, 2016)

You should probably check the "scp" command, which copies files over ssh from one machine to another.  If you can ssh to the machine, you can scp to it as well.
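For example (placeholder host and paths; -r recurses through the whole folder tree and -p preserves timestamps):

```shell
#!/bin/sh
# Placeholder host and paths; scp logs in the same way ssh does.
# scp -rp /media/pictures packetman@server:/backup/storage/
#
# The same syntax also accepts two local paths, which is an easy way to
# try out the flags before pointing it at the server:
copy_tree() { scp -rpq "$1" "$2"; }
```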


----------

