[Beowulf] Advice on 4 CPU node configurations

Prentice Bisbal prentice at ias.edu
Wed Feb 9 12:58:32 PST 2011



xingqiu yuan wrote:
> 
> 
> On Wed, Feb 9, 2011 at 2:37 PM, Prentice Bisbal <prentice at ias.edu> wrote:
> 
>     Craig Tierney wrote:
>     > On 2/9/11 7:38 AM, Prentice Bisbal wrote:
>     >> I don't think Numascale/ScaleMP has much of a cost advantage anymore.
>     >>
>     >> About 6 months ago, I purchased a couple of Dell PowerEdge R815s
>     >> with 128 GB of RAM and 32 cores. We looked at similar RAM
>     >> configurations a couple of years ago, and the cost premium for
>     >> that much RAM was prohibitive (the price for additional RAM
>     >> seemed to go up exponentially with the amount of RAM), so we
>     >> stayed at 32 GB.  This time around, the premium for the
>     >> additional RAM seemed marginal. Not sure how large you can go
>     >> and keep that "marginal" relationship, but it's much larger than
>     >> it was a couple of years ago.
>     >>
>     >
>     > There is only so much memory you can put in a box, regardless of
>     > the cost.  ScaleMP/Numascale let you get around that issue.  What
>     > if I need a TB of RAM?  Yes, I might be able to convert my
>     > algorithms, but if I can add $3k per node to get cache-coherent
>     > memory, why not go that route?
> 
>     >Agreed. I was going to add the caveat that there are limits to how
>     >much RAM you can add to a single box, or there will eventually be
>     >a point where the amount of RAM in a single box vs. price is no
>     >longer linear, but then I got lazy. ;)
> 
> 
> 
> I don't understand why you need a TB of RAM. Don't forget that $3K can
> almost buy one node!
> 

Certain problems require more RAM than others. I worked with an
astrophysicist who modelled the motion of planets. He hardly needed RAM
at all - all of his data would fit in the processor's cache.

On the other hand, genomics research is very data intensive, and can
involve searching for one very long string in millions (billions?) of
other long strings. That requires much more memory: if you can fit all
of those other strings into RAM, the pattern searching goes much faster
than repeatedly reading them from disk.
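
To make that concrete, here is a minimal sketch of the in-memory
approach (the file name, query string, and helper functions are
hypothetical, not anything from this thread): load the reference
strings once, then run every search against RAM instead of going back
to disk.

# A minimal sketch (hypothetical names) of why holding the reference
# strings in RAM helps: the file is read from disk exactly once, and
# every subsequent pattern search is a pure in-memory scan.

def load_references(path):
    """Read one reference string per line into a list held in RAM."""
    with open(path) as f:
        return [line.strip() for line in f]

def search(references, query):
    """Return the indices of the references that contain the query."""
    return [i for i, seq in enumerate(references) if query in seq]

if __name__ == "__main__":
    refs = load_references("reference_seqs.txt")   # one-time disk cost
    # With enough RAM the whole data set stays resident, so repeated
    # queries like this one never touch the disk again.
    hits = search(refs, "GATTACA")
    print("query found in %d of %d sequences" % (len(hits), len(refs)))

The same scan run against data streamed from disk would pay the I/O
cost on every query, which is the argument for a big-memory node.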

-- 
Prentice


