# Beowulf cluster



## cryptdir (Apr 24, 2011)

I need to create a cluster, and I wonder whether a Beowulf-type cluster automatically distributes tasks to the nodes without any extra configuration. For example, will an Apache server automatically distribute its processing across the nodes, or do I need to install a different Apache setup?


----------



## vivek (Apr 25, 2011)

Your question is not clear to me. Do you want a failover Apache (HA) cluster or a compute cluster (parallel computing)? Apache mostly falls under the HA/failover category. It will sit behind a load balancer (LB). An LB can be created using a reverse proxy server such as nginx or other supported open source software. The load balancers then need to share a VIP (virtual IP), created using pf+CARP or other open source software.
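To illustrate the reverse-proxy LB idea, here is a minimal sketch of an nginx configuration that spreads HTTP requests across two Apache backends. The addresses and the `apache_pool` name are hypothetical; adjust them to your own nodes.

```nginx
# Hypothetical backend Apache nodes; round-robin by default.
upstream apache_pool {
    server 192.168.1.11:80;
    server 192.168.1.12:80;
}

server {
    listen 80;
    location / {
        proxy_pass http://apache_pool;
        # Pass the original host and client IP through to the backends.
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

With a setup like this, the clients talk only to the LB (ideally on the shared VIP), and nginx distributes the work to whichever backends are up.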

Parallel computing is used for simulations, financial modeling, large data set mining and crunching.


----------



## cryptdir (Apr 25, 2011)

*Cluster*

For example, Google must use some clustering software. I have 16 servers and want to build a cluster for scalability and high availability, using the 16 servers as if they were a single one. Is there any tool for FreeBSD like OSCAR (http://svn.oscar.openclustergroup.org/trac/oscar)?

If not, what should I use?


----------



## DungeonMaster3000 (Apr 26, 2011)

Have you thought about DrQueue? It's mainly used for render farms, but it works just as well for batch processing.

http://www.drqueue.org/cwebsite/


----------



## vivek (Apr 26, 2011)

You are not providing exact information, but one can use CARP/HAST under FreeBSD.
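As a rough idea of what CARP looks like in practice, here is an /etc/rc.conf sketch for the FreeBSD 8.x-era carp(4) interface, where two hosts share a virtual IP and the backup takes over if the master dies. The VHID, password, and addresses are all hypothetical placeholders.

```
# /etc/rc.conf fragment (FreeBSD 8.x-style carp(4); hypothetical values)
cloned_interfaces="carp0"

# On the master: low advskew wins the election for the shared VIP.
ifconfig_carp0="vhid 1 advskew 0 pass mysecret 192.168.1.100/24"

# On the backup, use the same vhid/pass but a higher advskew, e.g.:
# ifconfig_carp0="vhid 1 advskew 100 pass mysecret 192.168.1.100/24"
```

HAST would sit underneath this for shared storage replication, with CARP deciding which node currently owns the VIP.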


----------



## SirDice (Apr 26, 2011)

HA and clustering are different techniques with different implementations. What you are probably looking for is an HA solution. Have a look at net/haproxy, or, as vivek mentioned, carp(4). But I'm not sure carp would scale well to 16 machines, or whether it's the right solution for you.
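For reference, a minimal haproxy configuration balancing HTTP across two backends might look like the sketch below. The file path, names, and addresses are assumptions, not a tested setup.

```
# /usr/local/etc/haproxy.conf -- minimal sketch, hypothetical addresses
global
    daemon
    maxconn 4096

defaults
    mode http
    timeout connect 5s
    timeout client  30s
    timeout server  30s

frontend www
    bind *:80
    default_backend apache_nodes

backend apache_nodes
    balance roundrobin
    # "check" enables health checks, so dead nodes drop out of rotation.
    server web1 192.168.1.11:80 check
    server web2 192.168.1.12:80 check
```

You would typically run haproxy on two boxes with a carp(4) shared VIP in front, so the load balancer itself is not a single point of failure.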


----------

