[Beowulf] New member, upgrading our existing Beowulf cluster

Peter Kjellstrom cap at nsc.liu.se
Wed Dec 2 12:28:07 PST 2009


On Wednesday 02 December 2009, Hearns, John wrote:
> I'm a new member to this list, but the research group that I work for has
> had a working cluster for many years. I am now looking at upgrading our
> current configuration.
...
> Mixing modern multi-core hardware with an older OS release which worked
> with those old disk drivers and Ethernet drivers will be a nightmare.

But why run an older OS release? Something like CentOS-5.latest will run fine 
on your new hardware, and it's no problem getting all sorts of old HPC code 
running on it (disclaimer: of course you can find a zillion apps that break 
on any given OS...).
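As a quick sanity check before committing, you can point something like the
sketch below at your old binaries on a test install of the new OS. It's a
minimal, hypothetical helper (the script and its name are mine, not part of
any distro); it just wraps ldd, which reports "not found" for every shared
library the OS can't resolve:

    #!/usr/bin/env python3
    # Hypothetical compatibility check: run ldd over an old HPC binary
    # and list every shared library the new OS fails to resolve.
    import subprocess
    import sys

    def missing_libs(path):
        # ldd prints lines like "libfoo.so.1 => not found" for
        # unresolved dependencies; collect the library names
        out = subprocess.run(["ldd", path], capture_output=True,
                             text=True).stdout
        return [line.split()[0] for line in out.splitlines()
                if "not found" in line]

    if __name__ == "__main__":
        for lib in missing_libs(sys.argv[1]):
            print("missing:", lib)

If that comes back empty for your codes, the zillion-apps caveat above
probably doesn't apply to you.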

> I was wondering if anyone has actual experience with running more than one
> node from a single power supply.
...
> Look at the Supermicro twin systems, they have two motherboards in 1U or
> four motherboards in 2U.
>
> I believe HP have similar.

HP has a 4-nodes-in-2U system (with the added benefit of using large 8 cm 
fans instead of those inefficient 1U fans...). Supermicro also offers 
4 nodes in 2U.
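On the shared-PSU question: the sizing logic is just headroom arithmetic.
A back-of-the-envelope sketch (the wattage and margin figures below are
illustrative assumptions, not vendor specs; check your actual hardware):

    # Rough PSU sizing for a multi-node chassis. All numbers are
    # assumptions for illustration only.
    nodes = 4               # e.g. a 4-in-2U twin chassis
    watts_per_node = 250.0  # assumed peak draw per dual-socket node
    headroom = 1.2          # ~20% margin so the PSU isn't run flat out

    required = nodes * watts_per_node * headroom
    print("PSU should be rated for at least %.0f W" % required)  # 1200 W

which is roughly where the shared supplies in such chassis tend to end up.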

> Or of course any of the blade chassis – Supermicro, HP, Sun and dare I say
> it SGI.

We've typically found that blade-chassis-type hardware is far from cost 
effective for HPC, but YMMV.

> On a smaller scale you could look at the ‘personal supercomputers’ from
> Cray and SGI.

Even less cost effective (I think).

> The contents of this email are confidential and for the exclusive use of
> the intended recipient...

Good job sending it to a public e-mail list then.

> If you receive this email in error you should not 
> copy it, retransmit it, use it or disclose its contents but should return
> it to the sender immediately and delete your copy.

/Peter