Top 500 trends
Brian D. Ropers-Huilman
bropers at lsu.edu
Mon Nov 25 20:16:45 PST 2002
On Mon, 25 Nov 2002 Dr. Joseph Landman <landman at scalableinformatics.com> wrote:
> On Mon, 2002-11-25 at 18:20, Ken Chase <math at velocet.ca> wrote:
>
> > Does anyone care? ("no.") Shouldn't we? Isn't that what Beowulf (and
> > HPC itself, obviously) is about?
>
> Building the fastest machine when you have an infinite budget is a neat
> exercise; it just isn't terribly relevant to most people. I look at
> those machines as usually non-commercially viable one-offs.
While I do not have budget numbers to share with the group, I can say
the following about our currently 17th-ranked system:
* We already had a relatively new machine room (only six years old), so
most infrastructure was in place
* We did have to add a new (huge) cooling unit, and the cluster is
running off of its own circuit, fed directly from the physical plant
* The vendor/integrator spent a full month on site, plus several other
visits, to get our numbers as high as they were
* We have one full-time sys admin (OS and infrastructure only) and one
full-time Ph.D. for user support (code porting/writing and
applications help)
* Most of our funding came directly from the state (the governor granted
the state a large IT infusion)
I have no idea what type of warranty or support we have on the system.
As far as personnel, the system is supposed to be "integrated" with our
existing SP and RS/6000 cluster for "consistency." My side is run with
three full-time people (including myself), and we manage over 100
production systems (campus web, e-mail, application servers, state-wide
library, athletic ticket sales, the business school's national SAP
system, the backup system, etc.).
--
Brian D. Ropers-Huilman            (225) 578-0461 (V)
Systems Administrator              (225) 578-6400 (F)
Office of Computing Services       brian at ropers-huilman.net
High Performance Computing         http://www.ropers-huilman.net/
Fred Frey Building, Rm. 201, E-1Q
Louisiana State University
Baton Rouge, LA 70803-1900