[Beowulf] standards for GFLOPS / power consumption measurement?

Gerry Creager N5JXS gerry.creager at tamu.edu
Tue May 10 07:34:59 PDT 2005


Hmmm.  Seems the Stone SouperComputer at ORNL was a zero-cash-outlay 
system, so its GFLOPS/$ ratio was either infinite or indeterminate. 
Of course, this is historical, so some here may not recall that 
effort.
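
The division-by-zero point is literal: Gflops per dollar is undefined when
the outlay is zero. A minimal sketch in Python; the $100,000 Orion price
below is back-computed from the thread's 110 Gflops at ~$909/Gflop, not a
quoted figure:

    def gflops_per_dollar(sustained_gflops, cost_dollars):
        # A zero-cash-outlay system yields an infinite (or, with zero
        # performance, indeterminate) ratio rather than a usable number.
        if cost_dollars == 0:
            return float('inf') if sustained_gflops > 0 else float('nan')
        return sustained_gflops / cost_dollars

    print(gflops_per_dollar(1.0, 0))       # inf: any Gflops at zero cost
    print(gflops_per_dollar(110, 100000))  # 0.0011, i.e. ~$909/Gflop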

gerry

Vincent Diepeveen wrote:
> How do you categorize systems bought second-hand?
> 
> I bought a third dual K7 mainboard plus 2 processors for 325 euro.
> 
> The rest I salvaged from old machines that would otherwise have been
> thrown away, like an 8GB hard disk. Amazingly, the biggest problem was
> getting a case to reduce sound production :)
> 
> Network cards I got for free; a very nice gesture from someone.
> 
> So when speaking of Gflops per dollar at Linpack, this will of course
> destroy any current record such as the $2500 system, especially for
> applications needing bandwidth to other processors, given what I paid
> for this self-constructed Beowulf.
> 
> At 05:19 PM 5/9/2005 -0400, Douglas Eadline - ClusterWorld Magazine wrote:
> 
>>On Thu, 5 May 2005, Ted Matsumura wrote:
>>
>>
>>>I've noted that the orionmulti web site specifies 230 Gflops peak, 110 
>>>sustained, ~48% of peak with Linpack, which works out to ~$909/Gflop? 
>>>The Clusterworld value box with 8 Sempron 2500s specifies a peak Gflops by 
>>>measuring CPU GHz x 2 (1 FADD, 1 FMUL), and comes out with a rating of 52% 
>>>of peak using HPL @ ~$140/Gflop (sustained?)
>>
>>It is hard to compare. I don't know what "sustained" or "peak" means in the
>>context of their tests. There is the actual measured number (which I assume
>>is sustained) and the theoretical peak (which I assume is peak).
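
To put numbers on this: the peak Ted describes is clock x flops/cycle x CPU
count, and "sustained" is what HPL actually measures. A sketch in Python;
the 1.75 GHz Sempron 2500+ clock is my assumption, not a figure from the
thread:

    # Theoretical peak: clock (GHz) x flops per cycle x number of CPUs.
    def peak_gflops(clock_ghz, flops_per_cycle, n_cpus):
        return clock_ghz * flops_per_cycle * n_cpus

    # Kronos-style estimate: 8 Semprons, 1 FADD + 1 FMUL per cycle.
    peak = peak_gflops(1.75, 2, 8)   # 28.0 Gflops theoretical peak
    sustained = 0.52 * peak          # ~14.6 Gflops if HPL hits 52% of peak
    print(peak, sustained)

    # Orion's quoted numbers imply the same kind of ratio:
    print(110.0 / 230.0)             # ~0.48 -- the "48% of peak" figure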
>>
>>And our cost/Gflop does not take into consideration the construction 
>>cost. In my opinion, when reporting these types of numbers, there
>>should be two categories: "DIY/self-assembled" and "turn-key". Clearly
>>Kronos is a DIY system and will always have an advantage over a 
>>turnkey system.
>>
>>> So what would the orionmulti measure out with HPL? What would the 
>>>Clusterworld value box measure out with Linpack?
>>
>>Other benchmarks are here (including some NAS runs):
>>
>>http://www.clusterworld.com/kronos/bps-logs/
> 
>>>Another line-item spec I don't get is rocketcalc's 
>>>( http://www.rocketcalc.com/saturn_he.pdf ) "Max Average Load"?? What does 
>>>this mean?? How do I replicate "Max Average Load" on other systems??
>>>I'm curious if one couldn't slightly up the budget for the Clusterworld box 
>>>to use higher-speed procs or maybe dual procs per node and see some 
>>>interesting value with regards to low $$/Gflop?? Also, the Clusterworld box 
>>>doesn't include the cost of the "found" utility rack, but does include the 
>>>cost of the plastic node boxes. What's up with that??
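
Rocketcalc's sheet doesn't define "Max Average Load" either. One guess, and
it is only a guess, is the highest one-minute load average observed while a
job runs. If so, it could be replicated on any Linux box by polling
/proc/loadavg:

    import time

    def max_average_load(duration_s, poll_s=1.0):
        # Track the highest 1-minute load average seen over the run.
        highest = 0.0
        deadline = time.time() + duration_s
        while time.time() < deadline:
            with open('/proc/loadavg') as f:
                highest = max(highest, float(f.read().split()[0]))
            time.sleep(poll_s)
        return highest

    print(max_average_load(60))  # sample for one minute while the job runs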
>>
>>This was explained in the article. We assumed that shelving was optional 
>>because others may wish to just put the cluster on existing shelves or a 
>>tabletop (or, with enough Velcro strips and wire ties, build a standalone 
>>cube!).
>>
>>Doug
>>
>>----------------------------------------------------------------
>>Editor-in-chief                   ClusterWorld Magazine
>>Desk: 610.865.6061                            
>>Cell: 610.390.7765         Redefining High Performance Computing
>>Fax:  610.865.6618                www.clusterworld.com
>>
> _______________________________________________
> Beowulf mailing list, Beowulf at beowulf.org
> To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf

-- 
Gerry Creager -- gerry.creager at tamu.edu
Texas Mesonet -- AATLT, Texas A&M University	
Cell: 979.229.5301 Office: 979.458.4020 FAX: 979.847.8578
Page: 979.228.0173
Office: 903A Eller Bldg, TAMU, College Station, TX 77843


