[Beowulf] 512 nodes Myrinet cluster Challenges
John Hearns
john.hearns at streamline-computing.com
Fri May 5 02:08:47 PDT 2006
On Fri, 2006-05-05 at 10:23 +0200, Alan Louis Scheinine wrote:
> Since you'all are talking about IPMI, I have a question.
> The newer Tyan boards have a plug-in IPMI 2.0 that uses
> one of the two Gigabit Ethernet channels for the Ethernet
> connection to IPMI. If I use channel bonding (trunking) of the
> two GbE channels, can I still communicate with IPMI on Ethernet?
We recently put in a cluster with bonded gigabit, however that was done
using a separate dual-port PCI card.
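For what it's worth, the bonding on that cluster was set up in the usual
Linux way; a minimal sketch (interface names, bonding mode and addresses
below are only examples, not our exact config):

  # /etc/modprobe.conf - load the bonding driver for bond0
  alias bond0 bonding
  options bonding mode=balance-rr miimon=100

  # enslave the two gigabit ports and bring the bond up
  ifconfig bond0 192.168.1.10 netmask 255.255.255.0 up
  ifenslave bond0 eth0 eth1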
On Supermicro boards, the IPMI card by default uses the same MAC address as
the eth0 port that it shares. You could reconfigure this, I think:
(ipmitool lan set 1 macaddr <x:x:x:x:x:x>)
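To expand on that, verifying and setting it with ipmitool would be along
these lines (channel 1 and the addresses are only examples; check
'lan print' for what your board actually uses):

  # show the BMC's current settings on LAN channel 1
  ipmitool lan print 1
  # set the new MAC as above, then give the BMC a static address of its own
  ipmitool lan set 1 ipsrc static
  ipmitool lan set 1 ipaddr 10.0.2.50
  ipmitool lan set 1 netmask 255.255.255.0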
Also, Supermicro have a riser card which provides a separate network and
serial port for the IPMI card.
Tyan probably have something similar.