Building a beowulf with old computers

Robert G. Brown rgb at phy.duke.edu
Sun Mar 9 14:56:48 PST 2003


On 9 Mar 2003, D. Scott wrote:

> Hi Beowulf experts
> 
> I've got old computers that have 32MB RAM, some with 1GB hard disks and 
> some with 512MB hard disks, and some with 16MB RAM. How best to build a 
> beowulf? I've looked into OSCAR but that requires a large amount of disk & 
> memory. SCE & SMA also require a minimum of 256MB RAM. Has anyone built a 
> beowulf with low-spec PCs?
> 
> 
> Thanks in advance

You can build a beowulf or other cluster with nearly anything if you are
willing to work hard enough.  However, you can't build a beowulf and WIN
in terms of work done with antique equipment.  For example, run the
following numbers.  Assume that those old 200 MHz Pentium computers
(presuming that this is what they are) draw a whopping 50 Watts each.

It costs roughly $1 per watt per year to run a computer 24x7, so each of
these computers will cost roughly $50/year just to leave turned on and
to air condition the room in which they sit.
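As a quick sanity check of that rule of thumb (the $1/watt/year figure folds in both electricity and the air conditioning), here is a minimal sketch; the function name is just for illustration:

```python
# Rule of thumb from the post: ~$1 per watt per year to run a box
# 24x7, including the cost of cooling the room it sits in.
COST_PER_WATT_YEAR = 1.0  # dollars per watt per year

def yearly_cost(watts, nodes=1):
    """Annual dollars to power (and cool) `nodes` machines drawing `watts` each."""
    return watts * nodes * COST_PER_WATT_YEAR

print(yearly_cost(50))      # one old 50 W Pentium: 50.0 dollars/year
print(yearly_cost(50, 16))  # sixteen of them: 800.0 dollars/year
```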

Now let's assume that you have 16 of them.  If you buy them a network, a
memory upgrade (you'll REALLY have a hard time building a cluster if you
don't bump memory to at LEAST 64-128 MB) so you can run them diskless or
nearly so, and a shelf and some surge protectors, you can probably
assemble the physical cluster for around $400.

It will have 3200 aggregate MHz of Pentium class CPU clock, and very
slow memory.

Now a Pentium loses more than a factor of 2 to any P6-family CPU in
instructions per clock, especially floating point instructions.  You
therefore have the equivalent of at most 1600 aggregate P6-class MHz in
16 systems that will cost you $400 and a LOT of time to set up, and
will cost you $800 to run for their first year.

The CHEAPEST computer you can buy now is probably a 1700 MHz Celeron,
and a stripped Celeron (256 MB of memory, a small disk, cheapest video
and so forth) can be had for around $500 in today's depressed market.
It will draw more electricity, but no more than perhaps 100 Watts
sustained.
For $600 you can get MORE work done with the single Celeron than you can
for $1200 invested in your "free" nodes, and even if you spend NOTHING
fixing them up, you lose.
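Running the comparison end to end, with the post's own figures (the factor-of-2 instructions-per-clock penalty is applied to the Pentiums to put both sides in roughly P6-equivalent MHz):

```python
# First-year cost and rough effective throughput, numbers from the post.
# "Effective MHz" halves the Pentium clock for its IPC deficit vs. P6.

# Old-Pentium cluster: 16 nodes, 200 MHz, 50 W each, ~$400 to assemble
old_effective_mhz = 16 * 200 / 2       # 1600.0 P6-equivalent MHz
old_first_year = 400 + 16 * 50 * 1.0   # hardware + power = $1200

# Single cheap Celeron: 1700 MHz, ~100 W sustained, ~$500 to buy
celeron_effective_mhz = 1700           # already P6-class IPC
celeron_first_year = 500 + 100 * 1.0   # $600

print(old_effective_mhz, old_first_year)          # 1600.0 1200.0
print(celeron_effective_mhz, celeron_first_year)  # 1700 600.0
```

Slightly more throughput for half the first-year money, before counting any of the setup labor on the old nodes.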

And then there is Amdahl's law, which basically means that you lose even
bigger than you'd expect on the basis of aggregate clock and per CPU
throughput for anything but embarrassingly parallel tasks.
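Amdahl's law caps the speedup of a job with serial fraction s on N processors at 1/(s + (1-s)/N).  A quick illustration of what that does to 16 nodes (the 10% serial fraction is just an example figure, and real communication overhead on a cheap network makes it worse):

```python
def amdahl_speedup(n, serial_fraction):
    """Maximum speedup on n processors per Amdahl's law."""
    s = serial_fraction
    return 1.0 / (s + (1.0 - s) / n)

# Even a modest 10% serial fraction holds 16 nodes well below 16x:
print(amdahl_speedup(16, 0.10))  # 6.4
```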

The sad truth is that cluster nodes have an ECONOMICALLY useful lifetime
of somewhere between 18 months and 3 years, depending on lots of things,
although one can arguably get work done out to 5 years on nodes that
require no human time to run or repair that other people are paying to
feed and cool.

Even for learning purposes or "fun", your nodes are a bit long in the
tooth (unless they are somehow 300-400 MHz P6-family CPUs, which seems
unlikely with so little memory).  I'd let them lie, save your money,
and see if you can't come up with $2500 or so, which is enough to get a
very nice little learning cluster either new or with relatively new
used hardware.

    rgb

-- 
Robert G. Brown	                       http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567  Fax: 919-660-2525     email:rgb at phy.duke.edu

