Max common block size, global array size on ia32

Chris Smith csmith at
Tue Jul 23 19:51:38 PDT 2002

You're actually hitting the data segment limit, not the stack limit. 

I moved the array declaration inside main() in the C program to
force it to be allocated on the stack, bumped up the stack limit as you
did, and the program ran. Alternatively, you can use malloc() to
allocate this memory on the heap.

I tried bumping up the data segment limit (even though it was unlimited
already), but still got the seg fault. Maybe this is a hard limit imposed
by the way the process address space is carved up.

Sorry ... I don't know how to do the same kind of manipulation in Fortran.

-- Chris

On Tue, 2002-07-23 at 16:43, Craig Tierney wrote:
> Sorry if this is a bit off topic.  I am not sure
> where to ask this question.  The following
> two codes fail on my system (Dual Xeon, 2 GB Ram,
> Linux-2.4.18, redhat 7.2).
> program memtest
> integer*8 size
> parameter(size=896*1024*1024)
> character a(size)
> common /block/ a
> write(*,*) "hello"
> stop
> end
> OR:
> #include<stdio.h>
> #include<memory.h>
> char ar[896*1024*1024];
> int main() { printf("Hello\n"); }
> I get a segmentation fault before the codes
> start.  I can use ifc, icc, pgf77 and gcc and
> get the same results.   If I change the array size to 895 MB,
> the codes run.  If I change the C code to
> define the array as 'static char ar[blah]' I can
> allocate more than 895MB.
> I have bumped up the max stack size with:
> ulimit -Hs 2048000
> ulimit -s 2048000
> But this does not help.
> I cannot find anywhere in the linux source where
> the max stacksize might be set.  It seems that
> it might be tied to 1 GB, but I cannot find it.
> Does anyone know how I can get around this
> issue?
> Thanks,
> Craig
> -- 
> Craig Tierney (ctierney at
> _______________________________________________
> Beowulf mailing list, Beowulf at
> To change your subscription (digest mode or unsubscribe) visit
