[Beowulf] HPC demo

Benson Muite benson_muite at emailplus.org
Mon Jan 20 20:02:02 PST 2020


a) For a technically knowledgeable audience, you could demonstrate some 
simple benchmark codes.
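
A minimal hand-rolled code is often enough here. The sketch below is 
only an illustration (not from any standard benchmark suite; the file 
name, sample count and seed are arbitrary): an MPI Monte Carlo estimate 
of pi that prints its wall time, so rerunning it with more ranks/nodes 
makes the scaling directly visible.

/* pi_mc.c - toy Monte Carlo estimate of pi; rerun with more ranks and
 * compare the reported wall time.
 * Build: mpicc -O2 pi_mc.c -o pi_mc
 * Run:   mpirun -np <N> ./pi_mc                                       */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const long long samples = 200000000LL;  /* total samples, split across ranks */
    long long local = samples / size, hits = 0;
    unsigned int seed = 1234u + (unsigned int)rank;

    double t0 = MPI_Wtime();
    for (long long i = 0; i < local; i++) {
        double x = (double)rand_r(&seed) / RAND_MAX;
        double y = (double)rand_r(&seed) / RAND_MAX;
        if (x * x + y * y <= 1.0) hits++;
    }
    long long total_hits = 0;
    MPI_Reduce(&hits, &total_hits, 1, MPI_LONG_LONG, MPI_SUM, 0, MPI_COMM_WORLD);
    double t1 = MPI_Wtime();

    if (rank == 0)
        printf("ranks=%d  pi~=%.6f  wall=%.3f s\n", size,
               4.0 * (double)total_hits / ((double)local * size), t1 - t0);
    MPI_Finalize();
    return 0;
}

For a serious measurement you would rather use an established suite 
such as HPL or the OSU micro-benchmarks, but a toy like this is easy to 
explain in a short meeting.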

b) For a more general audience, you might also look for parallel open 
source applications specific to the domain of interest. For example, 
for engineering, OpenFOAM (https://openfoam.com/) has some demo setups 
that can provide good motivation; FEniCS (https://fenicsproject.org/) 
and deal.II (https://www.dealii.org/) also have suitable examples. For 
the oil and gas industry there is SpecFEM3D 
(https://geodynamics.org/cig/software/specfem3d/), and for 
computational chemistry NWChem 
(http://www.nwchem-sw.org/index.php/Main_Page) might be nice. There are 
of course many other codes as well. If the cluster is mostly used as a 
task farm to run many single-node jobs rather than for one large 
parallel application, some embarrassingly parallel task is fine 
(https://en.wikipedia.org/wiki/Embarrassingly_parallel), ideally set up 
so that upper management can execute it themselves; see the sketch 
below. One such application is the Mitsuba renderer 
(https://www.mitsuba-renderer.org/); some measurements for Mitsuba on a 
small two-node setup can be found at 
https://courses.cs.ut.ee/MTAT.08.037/2015_spring/uploads/Main/Martoja.pdf
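
For the task-farm case, the pattern itself is easy to demonstrate even 
without a real application. The sketch below is purely illustrative 
(it is not related to Mitsuba; the task count and the busy-work 
function are made up): each MPI rank processes its own share of 
independent tasks with no communication until the final timing summary, 
so adding nodes shortens the wall time almost linearly.

/* task_farm.c - toy embarrassingly parallel demo: each rank processes
 * its own share of independent "tasks" (here just busy-work standing
 * in for e.g. rendering one frame), with no communication until the
 * final timing summary.
 * Build: mpicc -O2 task_farm.c -o task_farm -lm
 * Run:   mpirun -np <N> ./task_farm                                   */
#include <mpi.h>
#include <stdio.h>
#include <math.h>

/* Stand-in for one independent unit of work. */
static double do_task(int id)
{
    double s = 0.0;
    for (int i = 1; i < 5000000; i++)
        s += sin((double)(id + i)) / i;
    return s;
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int ntasks = 360;                    /* total independent tasks */
    double t0 = MPI_Wtime(), checksum = 0.0;
    for (int t = rank; t < ntasks; t += size)  /* round-robin task distribution */
        checksum += do_task(t);
    double elapsed = MPI_Wtime() - t0, max_elapsed, total_checksum;
    MPI_Reduce(&elapsed, &max_elapsed, 1, MPI_DOUBLE, MPI_MAX, 0, MPI_COMM_WORLD);
    MPI_Reduce(&checksum, &total_checksum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("ranks=%d  tasks=%d  wall=%.2f s  (checksum %.3f)\n",
               size, ntasks, max_elapsed, total_checksum);
    MPI_Finalize();
    return 0;
}

In production you would submit many independent single-node jobs 
through the scheduler instead, but this makes the one-node-versus-N-
nodes comparison concrete for an audience.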

Probably most useful is to find out who the users of the cluster are 
and have one or two of them explain how you have helped, or will help, 
improve company profitability by making their workflows more effective.

On 1/14/20 10:54 PM, Scott Atchley wrote:
> Yes, we have built a few of them. We have one here, one at AMSE, and 
> one that travels to schools in one of our traveling science trailers.
>
> On Tue, Jan 14, 2020 at 10:29 AM John McCulloch <johnm at pcpcdirect.com 
> <mailto:johnm at pcpcdirect.com>> wrote:
>
>     Hey Scott, I think I saw an exhibit like what you’re describing at
>     the AMSE when I was on a project in Oak Ridge. Was that it?
>
>     John McCulloch | PCPC Direct, Ltd. | desk 713-344-0923
>
>     *From:* Scott Atchley <e.scott.atchley at gmail.com
>     <mailto:e.scott.atchley at gmail.com>>
>     *Sent:* Tuesday, January 14, 2020 7:19 AM
>     *To:* John McCulloch <johnm at pcpcdirect.com
>     <mailto:johnm at pcpcdirect.com>>
>     *Cc:* beowulf at beowulf.org <mailto:beowulf at beowulf.org>
>     *Subject:* Re: [Beowulf] HPC demo
>
>     We still have Tiny Titan <https://tinytitan.github.io> even though
>     Titan is gone. It allows users to toggle processors on and off and
>     the display has a mode where the "water" is color coded by the
>     processor, which has a corresponding light. You can see the frame
>     rate go up as you add processors and the motion becomes much more
>     fluid.
>
>     On Mon, Jan 13, 2020 at 7:35 PM John McCulloch
>     <johnm at pcpcdirect.com <mailto:johnm at pcpcdirect.com>> wrote:
>
>         I recently inherited management of a cluster and my knowledge
>         is limited to a bit of Red Hat. I need to figure out a demo
>         for upper management graphically demonstrating the speedup of
>         running a parallel app on one x86 node versus multiple nodes
>         up to 36. They have dual Gold 6132 procs and Mellanox EDR
>         interconnect. Any suggestions would be appreciated.
>
>         Respectfully,
>
>         John McCulloch | PCPC Direct, Ltd.
>

