Anyone have information on latest LSU beowulf?

Brian D. Ropers-Huilman bropers at lsu.edu
Fri Oct 4 11:53:53 PDT 2002


The cluster is "ours" (or at least we are soon to be officially involved 
with it). Rocky, who mentions the Intel site below, is from Atipa, the 
systems integrator who constructed the machine.

I believe the initial LINPACK results (up to 2.0 TF) were achieved by 
Atipa, but that our current systems administrator managed to eke out the 
extra 0.2 TF for our 2.2 TF rating.
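
For a rough sense of what 2.2 TF means on this hardware, here is a quick 
back-of-the-envelope check (my own sketch, not an official figure; it 
assumes the usual 2 double-precision flops per clock for Xeons of this 
class, so treat the numbers as approximate):

    # Rough peak/efficiency estimate for 512 dual 1.8 GHz Xeon nodes
    # (specs as quoted below).  flops_per_clock is an assumption.
    nodes = 512
    cpus_per_node = 2
    clock_hz = 1.8e9
    flops_per_clock = 2          # assumed: SSE2 double precision

    peak_tflops = nodes * cpus_per_node * clock_hz * flops_per_clock / 1e12
    hpl_tflops = 2.2             # the reported HPL result

    print("Theoretical peak: %.2f TF" % peak_tflops)                        # ~3.69 TF
    print("HPL efficiency:   %.0f%%" % (100.0 * hpl_tflops / peak_tflops))  # ~60%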

I don't know any details of the compilation or parameter tuning, but I 
can get that information for you.

If you want any other specifications, I can provide those as well.

On Fri, 4 Oct 2002 beowulf-admin at beowulf.org wrote:
> On Mon, 23 Sep 2002, Craig Tierney wrote:
> 
> > Does anyone have any information or contacts
> > at LSU about their new system?  It is 512 dual-Xeon
> > 1.8 GHz nodes connected with Myrinet.
> > 
> > See:
> > 
> > http://www.phys.lsu.edu/faculty/tohline/capital/beowulf.html
> > 
> > Their HPL result is 2.2 Tflops!  Very impressive.
> > 
> > I wanted to find out more about how they configured their
> > system.  What are they using for a batch system?  How are
> > they doing their IO?   What HPL settings did they use to
> > achieve their result?
> > 
> > Thanks,
> > Craig
> > 
> 
> http://www.intel.com/ebusiness/pdf/affiliates/LSU0240.pdf
> 
> They're using PBSPro for the batch system.
> 
> For I/O, they are connected to the campus SAN through 8 nodes, and also 
> have 4 I/O nodes with decently sized RAIDs running PVFS.
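
To give a flavor of what running a job through PBSPro looks like on a 
cluster like this, here is a generic submission sketch (the node counts, 
walltime, and mpirun flags are invented for illustration and are not our 
actual configuration; the exact launcher depends on the local Myrinet/MPI 
stack):

    #!/usr/bin/env python
    # Hypothetical PBS submission sketch -- not the actual LSU setup.
    import subprocess

    job_script = """#!/bin/sh
    #PBS -N example_job
    #PBS -l nodes=16:ppn=2
    #PBS -l walltime=02:00:00
    cd $PBS_O_WORKDIR
    # One MPI rank per CPU on the allocated nodes; flags vary by MPI stack.
    mpirun -np 32 -machinefile $PBS_NODEFILE ./a.out
    """

    # qsub reads the job script from stdin and prints the assigned job id.
    result = subprocess.run(["qsub"], input=job_script,
                            capture_output=True, text=True)
    print(result.stdout.strip())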

-- 
Brian D. Ropers-Huilman                   (225) 578-0461 (V)
Systems Administrator                     (225) 578-6400 (F)
Office of Computing Services              brian at ropers-huilman.net
High Performance Computing                http://www.ropers-huilman.net/
Fred Frey Building, Rm. 201, E-1Q                             \o/
Louisiana State University                      --  __o   /    |
Baton Rouge, LA 70803-1900                     --- `\<,  /    `\\,
                                                   O/ O /     O/ O



