[Beowulf] Infiniband modular switches

Joe Landman landman at scalableinformatics.com
Thu Jun 12 07:36:32 PDT 2008

Ramiro Alba Queipo wrote:
> Hello everybody:
> We are about to build an HPC cluster with an InfiniBand network, starting
> from 22 dual-socket nodes with AMD quad-core processors; in a year or
> so we will have about 120 nodes. We will be using InfiniBand both
> for computation and for storage.

Hi Ramiro:

   You may run into contention issues in this case: with computation and 
storage sharing one fabric, heavy I/O can interfere with latency-sensitive 
MPI traffic.

> The question is that we need a modular solution and we are having 3
> candidates:
> a) Voltaire Grid Director SDR or DDR, 288 ports (9988 or 2012 models) ->
> seems very good and well supported, but very expensive.
> b) QLogic SilverStorm 9120 (144 ports) -> no price or support
> information yet.
> c) Flextronics 10U 144-port modular -> very good price, but little
> support -> a risky option?

The Flextronics units have Mellanox IP/chips inside (as do, I believe, 
many/most of the others).  That is, the risk is low from a "will it 
work" point of view.  Flextronics is an ODM, though, so they may not 
provide the level of support around the system that you would get from 
Voltaire et al.

Do you want/need a 1:1 (fully non-blocking) architecture, where every 
port has full bandwidth to every other port, or are you able/willing to 
look at oversubscribed uplinks?  Part of this depends on your traffic 
patterns, your code's latency requirements, and your storage bandwidth.
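As a back-of-the-envelope illustration of what oversubscription means in a 
two-tier (leaf/spine) fabric -- the port counts below are hypothetical, not 
taken from any of the switches under discussion:

```python
# Hypothetical sketch: oversubscription ratio of a leaf switch in a
# two-tier fat-tree fabric. Port counts are illustrative only.

def oversubscription(ports_down: int, ports_up: int) -> float:
    """Ratio of host-facing bandwidth to uplink bandwidth on a leaf switch.

    1.0 means a fully non-blocking (1:1) fabric; larger values mean the
    attached nodes can collectively offer more traffic than the uplinks
    can carry, so flows may contend under load.
    """
    return ports_down / ports_up

# A 24-port leaf with 12 ports to nodes and 12 uplinks is 1:1 ...
print(oversubscription(12, 12))  # 1.0
# ... while 18 node ports and only 6 uplinks gives 3:1 oversubscription.
print(oversubscription(18, 6))   # 3.0
```

Whether 2:1 or 3:1 is acceptable comes down to the traffic patterns above: 
all-to-all collectives and heavy storage streams suffer first.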

The Voltaire units are good; we have deployed them for customers, with 
no complaints.  Flextronics should be fine, as should QLogic.  We have 
customers running all of these, and we rarely hear complaints about IB 
switches.

> I am in a mess. What is your opinion on this matter? Are you using
> any of these products?
> Regards

Joseph Landman, Ph.D
Founder and CEO
Scalable Informatics LLC,
email: landman at scalableinformatics.com
web  : http://www.scalableinformatics.com
phone: +1 734 786 8423
fax  : +1 866 888 3112
cell : +1 734 612 4615
