# Synchronously update 3 or more FreeBSD systems



## stasmus (Dec 19, 2008)

I have 3 FreeBSD copies, installed from an image (ex. http://snowpenza.ru ). The only differences are in the password/config files (ex. php.ini, apache.conf).
How can I update all copies synchronously (ex. portupgrade -ai)? What software should I use?


----------



## brd@ (Dec 19, 2008)

Well, you could use portupgrade to upgrade one of them and then rsync everything from that one to the others; just use the --exclude option to leave out things like /etc and /usr/local/etc.
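A dry-run sketch of that idea (the host names are placeholders, and the loop only echoes each rsync command so nothing is transferred until you remove the `echo`):

```
#!/bin/sh
# Dry run: print the rsync command that would push the upgraded master
# to each replica, skipping the per-host config trees.  Remove the
# leading "echo" in sync_cmd to perform the transfer for real.
REPLICAS="host2 host3"      # placeholder host names

sync_cmd() {
    echo "rsync -a --exclude=/etc --exclude=/usr/local/etc / root@$1:/"
}

for HST in ${REPLICAS}; do
    sync_cmd "${HST}"
done
```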


----------



## stasmus (Dec 24, 2008)

brd@ said:

> Well you could use portupgrade to upgrade one of them, and then rsync everything from that one to the others, just use the --exclude option to exclude things like /etc and /usr/local/etc..



Nice, but I don't think that's a good idea.
Is there maybe a multi-SSH tool or some other method?


----------



## Ole (Dec 24, 2008)

IMHO, some software from ports can help:

```
ports]# make search key="for executing commands" |egrep "(^Path|^Info)"
Path:   /usr/ports/sysutils/rubygem-capistrano
Info:   A utility for executing commands in parallel on multiple machines
Path:   /usr/ports/sysutils/tentakel
Info:   A program for executing commands on many hosts in parallel
```

or simply write a shell script

```
#!/bin/sh
# Commands to run on every remote host.
RCMD="sudo whoami; \
uname -r; \
hostname"

HOSTS="host1 host2 host3"

for HST in ${HOSTS}; do
    T=$(ssh "user@${HST}" "${RCMD}")
    echo "${T}"
done
```

with ssh-key auth.

When the architectures of the hosts are the same, you can run "portupgrade -L logfile" on host1, then analyse the log file to find the updated packages and run

```
pkg_create -b <package>
```

for each of them, then transfer the .tbz files to the other hosts and install them there.
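A dry-run sketch of that shipping step (the package name and host names are made-up placeholders; every command is echoed rather than executed, so remove the "echo" prefixes to do it for real):

```
#!/bin/sh
# Dry run: print the commands that would package one updated port on
# the build host, copy the .tbz to each replica, and install it there.
HOSTS="host2 host3"                 # placeholder replica names

ship_pkg() {
    pkg="$1"
    echo "pkg_create -b ${pkg}"     # writes ${pkg}.tbz
    for h in ${HOSTS}; do
        echo "scp ${pkg}.tbz root@${h}:/tmp/"
        echo "ssh root@${h} pkg_add /tmp/${pkg}.tbz"
    done
}

ship_pkg apache-2.2.11              # hypothetical package name
```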


----------



## danger@ (Dec 24, 2008)

You can build your own packages on one of your 3 machines, then export /usr/ports over NFS to the other ones and use portupgrade -PP to upgrade the other two...
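A dry-run sketch of that scheme (the host name and network values are placeholders, and each step is only echoed; run the real commands on the machine named in its comment):

```
#!/bin/sh
# Dry run: print the steps of the shared-ports/package scheme.
steps() {
    # on the build host: -a upgrades everything, -p keeps the packages
    echo "portupgrade -ap"
    # /etc/exports entry on the build host (placeholder network):
    echo "/usr/ports -ro -network 192.168.0.0 -mask 255.255.255.0"
    # on each replica: mount the shared tree, upgrade from packages only
    echo "mount buildhost:/usr/ports /usr/ports"
    echo "portupgrade -aPP"
}
steps
```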


----------



## vermaden (Dec 24, 2008)

You can use Cluster SSH to update as many hosts as you want:
http://freshmeat.net/projects/clusterssh/


----------



## Pushrod (Dec 24, 2008)

I don't see what's wrong with rsync. It's straightforward and probably the least likely to screw up. You could even do things like rsync to one other machine, and then use the original machine and the one updated machine to update the remaining machines.
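The fan-out idea can be sketched like this (host names are placeholders, with a hypothetical fourth host added just to show the parallel stage; remove the "echo" to run real transfers):

```
#!/bin/sh
# Dry run: once host1 is updated, host1 and host2 can update the
# remaining machines in parallel.
sync_from() {   # $1 = source host, $2 = target host
    echo "ssh root@$1 rsync -a --exclude=/etc --exclude=/usr/local/etc / root@$2:/"
}

sync_from host1 host2       # stage 1: make one updated copy
sync_from host1 host3 &     # stage 2: two sources work in parallel
sync_from host2 host4 &
wait
```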


----------



## Ole (Dec 25, 2008)

Pushrod said:

> I don't see what's wrong with rsync.



IMHO, there is a rare situation where rsync transfers and overwrites a file on the target while that file is being executed.
(For example, when I ran "make installworld" not in single-user mode while another parallel session kept executing external programs, I got "core dumped" or "Segmentation fault" errors as the "make install" job overwrote /lib/libc.so*, the most commonly used library.)

I suggest an experiment (not for production servers!), something like:
1) copy /lib to /lib-orig and /usr/lib to /usr/lib-orig
2) write and run a shell script with an infinite loop of "rsync /lib-orig/* /lib/"
3) watch "tail -f /var/log/messages" for crashed processes

With the rsync option "--delete-after" the chance of bad behaviour is smaller, but without locking it is still not good. Or is it?


----------



## MG (Dec 28, 2008)

I don't see any problem.
I would probably just ssh to those computers one by one and do the update.
Of course you could set up something like a cron job and have the BIOS clocks synced with a time server.
Or put all three machines in an endless loop waiting for "DO IT NOW!!!" to appear on a webpage. That way you only have to put a file at some web location to initiate the updates.
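That web-trigger loop could look something like this (the URL and interval are placeholders, fetch(1) is FreeBSD's stock downloader, and run_update only echoes the command as a dry run; call `poll_loop` on each machine to arm it):

```
#!/bin/sh
# Poll a web location until the trigger file appears, then upgrade.
URL="http://example.org/DO_IT_NOW"      # placeholder trigger URL
INTERVAL=60                             # seconds between polls

trigger_present() {
    fetch -q -o /dev/null "${URL}" 2>/dev/null
}

run_update() {
    echo "portupgrade -a"   # remove the echo to really upgrade
}

poll_loop() {
    while :; do
        if trigger_present; then
            run_update
            return 0
        fi
        sleep "${INTERVAL}"
    done
}
```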


----------

