<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<HTML><HEAD>
<META content="text/html; charset=iso-8859-1" http-equiv=Content-Type>
<META content="MSHTML 5.00.2920.0" name=GENERATOR>
<STYLE></STYLE>
</HEAD>
<BODY bgColor=#c0c0c0>
<DIV><FONT face=Arial size=2>I am currently in the process of constructing a
4-node Beowulf cluster for MD simulations. Each node will be equipped with an
Alpha 5XX (21164) and approximately 576 MB of RAM. (More nodes may be added if
an increase in speed is needed.)<BR><BR>Curious about the speed differences
between UNIX and Linux, I tested two machines with the same hardware setup.
One machine was running Digital UNIX, and the other Linux (Red Hat 6.2).
Additionally, the kernel on the Linux machine was built with only the drivers
needed.<BR><BR>To my
surprise, the durations of all test runs were approximately three times longer
on the Linux machine. Do I need to look into compiler options to optimize the
code? If so, where can I find a list of the available options and their
meanings for gcc and ccc? Also, will the same options apply when I build the
code that includes the message passing to utilize the
Beowulf?<BR><BR>Additionally, I tested the
compiler made by Compaq (ccc) to see if the gcc compiler might be the problem.
Using ccc to build the source with no compiler options, I again tested for
speed differences. The speeds were approximately the same as with gcc. This is
unacceptable. In fact, the four-node Beowulf configured this way will at best
be slightly faster than the single Digital UNIX machines currently used to run
our simulations. Any insight or suggestions as to the source of my problem, or
possible solutions, would be greatly appreciated.<BR><BR><BR>Matt
Lee<BR>Undergraduate Research Assistant<BR>Mechanical and Aerospace
Engineering<BR>Oklahoma State
University<BR><BR><BR><BR></FONT></DIV></BODY></HTML>