[Beowulf] g77 limits...
Robert G. Brown
rgb at phy.duke.edu
Thu Feb 23 20:11:17 PST 2006
On Thu, 23 Feb 2006, Jim Lux wrote:
> At 12:15 PM 2/23/2006, Robert G. Brown wrote:
>> On Thu, 23 Feb 2006, Jim Lux wrote:
>>
>>> All art and creativity is enhanced by limitations of the medium. I, for
>>> one, find that the fact that FORTRAN has a native complex type (since IV
>>> days, and maybe even FORTRAN II had it) and exponentiation as a native
>>> operator has been more useful than structs and pointers (all that dynamic
>>> allocation stuff only gets you into trouble anyway). And FORTRAN does have
>>> the EQUIVALENCE statement, which, especially with named common, can be used
>>> to create a form of struct.
>>
>> Yeah, but try allocating a triangular array to hold Y(l,m) efficiently,
>> with l \in (0,l_max) and m \in (-l,l). Try doing that for I(L,L',L")
>> (Gaunt numbers or 9j symbol tables). Try allocating structs that have
>> object metadata as well as the actual data. Try building or working
>> with linked lists, trees, all sorts of recursive data structures.
>
>
> Hold triangular arrays efficiently? There's zeros in the array, why not
> store them in memory? If a problem is worth doing, it's worth doing with
> brute force. If the hammer you have isn't big enough, buy a bigger
> hammer (or a supercomputer, as it were). Of such a straightforward analysis
> did Seymour Cray make his fame.
Gee, why didn't I think of that? Oh, wait, back when I USED fortran for
this sort of thing that's what I did, perforce. And heck, granting
agencies are so dumb -- they won't mind buying me a cluster of 16 GB
nodes just because I'm working with sparse datasets... will they? ;-)
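(Just for concreteness -- and purely as an invented sketch, not anything out
of my actual code -- the packed Y(l,m) allocation the quoted bit alludes to is
a few lines of pointer arithmetic in C, assuming complex-valued Y's and
made-up names alloc_ylm/free_ylm:

  #include <complex.h>
  #include <stdlib.h>

  /* Packed storage for Y(l,m), l = 0..l_max, m = -l..l: (l_max+1)^2 complex
   * values in one block, with each row pointer offset by l so that y[l][m]
   * works for negative m.  Illustration only; error handling kept minimal. */
  double complex **alloc_ylm(int l_max)
  {
      int npack = (l_max + 1) * (l_max + 1);   /* sum over l of (2l+1) */
      double complex *data = calloc(npack, sizeof *data);
      double complex **y = malloc((l_max + 1) * sizeof *y);
      if (!data || !y) { free(data); free(y); return NULL; }
      for (int l = 0, off = 0; l <= l_max; off += 2 * l + 1, l++)
          y[l] = data + off + l;               /* y[l][-l..l] is now valid */
      return y;
  }

  void free_ylm(double complex **y)
  {
      if (!y) return;
      free(y[0]);    /* y[0] points at the start of the packed block */
      free(y);
  }

After which y[2][-1] and friends just work, and the whole thing occupies
(l_max+1)^2 complex numbers instead of a zero-padded rectangle.)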
> As for metadata, that's what the chalkboard in your office is for, isn't it?
No, no, no. That's what the non-erasable marker on my floppy disks is
for. Or on CD-Rs, as the case may be. The chalkboard in my office
contains a list of all the books I've loaned out to students mixed in
with pictures my kids drew ten years ago and sundry antique
calculations. If it were ever erased and used for something like
metadata I'd have to just kill myself.
> Next thing you know, you'll be advocating that we can split up big problems
> into lots of little ones and use a room full of these toy things called PCs
> to do the calculations.
You should talk -- you're the one who wants to run them on a stack of
these toy things called "toy things" -- like Game Boy Advance CPUs or
PDAs -- right? ;-)
But only if they can be programmed in C....
> True enough. But there might be a limit to this. Consider the gyrations
> that any language goes through to provide a simple representation of certain
> operations that are efficient on the target computer. A good example would be
> FFT butterflies on a DSP oriented CPU (with hardware that supports stride
> indexing and in-place bitreversed addressing), or even certain vectorizable
> computations (filters with long sequences of multiply and accumulate).
I understand this. Indeed, for a long time that was fortran's real
strength -- it ran super-well on a vector machine because all it did was
arithmetic on arrays that had to be defined fortran style, that is to
say statically for most practical purposes. You could optimize the hell
out of this at the compiler level, and C often sucked in comparison on
this kind of operation.
On a non-vector machine the advantage wasn't terribly pronounced, and
then improvements in C started to match the improvements in CPU design
-- SSE support, etc. At this point you probably have MORE control over
core loop design in C than in fortran -- at least a better idea of what
is really going on on the CPU in any given code block, since C is that
"thin veneer of upper level language sensibility on top of raw
assembler" kinda thing. But I wouldn't be surprised if DSPs (being
heavily vector arithmetic kinds of things) "like" fortran-like
constructs.
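(The canonical example is exactly the multiply-and-accumulate loop Jim
mentions. A trivial sketch, nothing up my sleeve:

  /* FIR-filter style multiply-and-accumulate.  "restrict" (C99) promises
   * the compiler that x and h don't overlap; with aggressive optimization
   * (e.g. gcc -O3 -ffast-math) loops like this typically get compiled down
   * to SSE vector instructions with no further effort from the programmer. */
  double mac(const double *restrict x, const double *restrict h, int n)
  {
      double acc = 0.0;
      for (int i = 0; i < n; i++)
          acc += x[i] * h[i];
      return acc;
  }

And if the generated code isn't what you hoped for, gcc -S shows you exactly
what the loop turned into -- which is the "thin veneer over assembler" point
in a nutshell.)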
This is something that I just live with when programming. I have some
real live code that involves LL'L" indexing -- six indices in a
triply-triangular array. It turns out to be very difficult
topologically to go through a mere THREE dimensional array in vectors --
if you consider e.g. nearest neighbor sites to any i,j,k, at most one
dimension of them can be "local"; the other two are displaced by at
least L and L^2 respectively -- so running through simple sums over nearest
neighbors cannot avoid thrashing the cache a bit. Checkerboard
algorithms and the like can help some, but to solve the nonlocality
problem is a real headache. Fortran won't do any better -- it might
well do worse, since if you DO repack a sextuply-indexed triangular array
you might get it down to a page in size, where if you leave it unpacked
and pad it with zeros it spans multiple pages.
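(To put numbers on the nonlocality, here's a toy illustration -- lattice size
and macro names invented for the example -- of where the six nearest
neighbors of a site land in a row-major L^3 array:

  #include <stdio.h>

  #define L 64                                       /* illustrative edge length */
  #define IDX(i, j, k) (((i) * L + (j)) * L + (k))   /* row-major flattening */

  int main(void)
  {
      int i = 10, j = 20, k = 30;                    /* any interior site */
      /* Distances in memory (in elements) to the six nearest neighbors:
       * +/-1 along k are adjacent, +/-1 along j are a whole row (L) away,
       * +/-1 along i are a whole plane (L*L) away. */
      printf("k neighbors: %+d elements away\n", IDX(i, j, k + 1) - IDX(i, j, k));
      printf("j neighbors: %+d elements away\n", IDX(i, j + 1, k) - IDX(i, j, k));
      printf("i neighbors: %+d elements away\n", IDX(i + 1, j, k) - IDX(i, j, k));
      return 0;
  }

which prints +1, +64 and +4096 -- so at most one direction of any
nearest-neighbor sum can ever be cache-friendly, whatever language the
indexing is written in.)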
I actually have given a bit of thought to the notion of designing a
multidimensional system memory -- one with a triple addressing system
that actually has intrinsic ability to deliver memory stripes in all
three directions, integrated with a tripled memory subsystem that feeds
three caches into a CPU with tripled CPU pipelines. Do vector
calculations involving NN arithmetic in three dimensions! For some
classes of problems this would produce a really, really significant
performance boost, I think. In fact, since there are some problems
(lattice gauge theory, for example) that are typically computed on a 4
(or 4+1) dimensional lattice, going one or two dimensions higher might
even be worthwhile.
So let's see, this would only cost what, 4 to 6 billion dollars for a
foundry? Or is this estimate light? Clearly worth it.
> Actually, I find it amazing that there's still an awful lot of brand new
> FORTRAN code being written, considering that the language was invented to
> solve the problems that were current in the late 50s (and had research
> budgets to pay for them), and has all these clunky aspects.
Absolutely in-credible. In so many ways.
And the thing is, you can talk to some of the people who are still
writing that code, and say hey, wassup with this fortran thing, why
aren't you using C, and they'll actually look GUILTY about it. I mean,
they won't meet your eye while talking about it -- eyes darting all over
the room, hands fidgeting. They KNOW better, damn it, and do it anyway!
They are actually ASHAMED of themselves, and they do it anyway! Then
they get together with others who share the same perversion and chuckle
together about how they once again resisted the temptation to give up
Fortran and program in something a bit more modern. I dunno, PL/I or
something.
I personally think that they must have done SOMETHING that stimulated
the addiction centers in their brains while coding back when they were
growing up. They get some sort of dopamine rush from it. Or else it is
like tattooing yourself in primitive cultures to prove your manhood,
or body piercing.
They need a twelve step plan. Or keyboards that deliver a nasty but
nonfatal shock anytime the space bar is depressed six times without an
intervening character. And NOBODY actually needs the caps lock button
anymore (which is invariably where the CONTROL key is supposed to be
anyway, damn it! IBM clearly got paid off by the docs who take care of
carpal tunnel problems, way back in the 80's.)
> And yes, while it's clunky, FORTRAN is no worse than, say, assembler, for
ROTFL:-) This one line deserves to be written 10^22 times in a do loop
by all those ashamed programmers. One that uses the 10 CONTINUE form,
nested however many times it takes to do 10^22 iterations with INTEGER
types.
> some of these non-optimal applications: Over the past 30 or so mis-spent
> years, I've written, in FORTRAN, a couple compilers, a couple quasi RealTime
> OS kernels, a B-tree based file system with transparent mirrors and caches
> with multiple simultaneous accesses on dual port drives, a variety of
> debuggers, an 8086 simulator, and an optimum path router for aircraft. I am
> heartily glad that should I have to duplicate any of these things, I wouldn't
> have to use FORTRAN today. But, back in the day, FORTRAN is what you had
> (Imagine writing a shared filesystem in COBOL!), and it was a heck of a lot
> easier to use than, e.g. BAL.
You have my sincere and heartfelt sympathies. I've done enough similar
things to be able to share your pain, although not (recently) with
Fortran.
DeSmet C on the good old IBM PC back in maybe 1984. Cost $100 and was
worth every penny, and by 1986 or 87 I was using Unix and the real
thing, with jove as an editor and everything. The fortran I've
written/read post 1986 can be reduced to -- well, not QUITE none, but
none willingly, and mostly in the context of helping hapless students or
postdocs with programs they for whatever crazed reason wrote in fortran,
or trying to convert Dark Evil (like the original diehard program) into
C.
rgb
>
>
>
> James Lux, P.E.
> Spacecraft Radio Frequency Subsystems Group
> Flight Communications Systems Section
> Jet Propulsion Laboratory, Mail Stop 161-213
> 4800 Oak Grove Drive
> Pasadena CA 91109
> tel: (818)354-2075
> fax: (818)393-6875
>
>
--
Robert G. Brown http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567  Fax: 919-660-2525  email: rgb at phy.duke.edu