[Beowulf] Re: MPI & ScaLAPACK: error in MPI_Comm_size: Invalid communicator

cjoung at REDACTED
Mon Oct 18 21:59:55 PDT 2004


Hi again - I need to correct something in my earlier email: I left
out the subroutine SL_INIT.

Here is the entire code again:

****example2.f***************
      program example2
      integer ictxt,npcol,nprow

      nprow=2
      npcol=3

      call sl_init(ictxt,nprow,npcol)

      call BLACS_EXIT(0)
      stop
      end

      subroutine sl_init(ictxt,nprow,npcol)
      integer ictxt,nprow,npcol,iam,nprocs
      external BLACS_GET,BLACS_GRIDINIT,BLACS_PINFO,BLACS_SETUP

      call BLACS_PINFO(iam,nprocs)
        
      if (nprocs.lt.1) then
        if (iam.eq.0) nprocs=nprow*npcol
        call BLACS_SETUP(iam,nprocs)
      endif

      call BLACS_GET(-1,0,ictxt)
      call BLACS_GRIDINIT(ictxt,'Row-major',nprow,npcol)

      return
      end
*****************************

The errors are still the same however - it doesn't like any of my
BLACS calls.

Any help would be greatly appreciated,
thanks
Clint Joung



----- Forwarded message from cjoung at REDACTED -----
    Date: Tue, 19 Oct 2004 12:14:53 +1000
    From: cjoung at REDACTED
 Subject: MPI & ScaLAPACK: error in MPI_Comm_size: Invalid communicator
      To: beowulf at beowulf.org

Hi, I was hoping someone could help me with a F77,MPI & ScaLAPACK
problem. 

<snip>

Yet it still doesn't work! The following is the output when I
compile and run it:
*********************************************************
[tony at carmine clint]$ mpif77 -o example2 example2.f \
                             -L/opt/intel/mkl70cluster/lib/32 \
                             -lmkl_scalapack \
                             -lmkl_blacsF77init \
                             -lmkl_blacs \
                             -lmkl_blacsF77init \
                             -lmkl_lapack \
                             -lmkl_ia32 \
                             -lguide \
                             -lpthread \
                             -static-libcxa
[tony at carmine clint]$ mpirun -n 6 ./example2 
aborting job: Fatal error in MPI_Comm_size: Invalid communicator, error stack:
MPI_Comm_size(82): MPI_Comm_size(comm=0x5b, size=0x80d807c) failed
MPI_Comm_size(66): Null Comm pointer
aborting job:
Fatal error in MPI_Comm_size: Invalid communicator, error stack:
MPI_Comm_size(82): MPI_Comm_size(comm=0x5b, size=0x80d807c) failed
MPI_Comm_size(66): Null Comm pointer
rank 5 in job 17  carmine.soprano.org_32782   caused collective abort of all ranks
  exit status of rank 5: return code 13
aborting job:
Fatal error in MPI_Comm_size: Invalid communicator, error stack:
MPI_Comm_size(82): MPI_Comm_size(comm=0x5b, size=0x80d807c) failed
MPI_Comm_size(66): Null Comm pointer
rank 1 in job 17  carmine.soprano.org_32782   caused collective abort of all ranks
  exit status of rank 1: return code 13
rank 0 in job 17  carmine.soprano.org_32782   caused collective abort of all ranks
  exit status of rank 0: return code 13
[tony at carmine clint]$
*********************************************************
So apparently something is wrong with MPI_Comm_size, but
beyond that I can't figure it out.
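
For reference, here is a minimal MPI-only test program I can use to
narrow things down - it is just a sketch (not part of my original
code) that calls MPI directly with no BLACS or ScaLAPACK at all. If
this also aborts under the same mpif77/mpirun, the problem is in the
MPI installation itself; if it runs cleanly, the mismatch is between
the BLACS library and the MPI it was built against.

****mpicheck.f***************
* Minimal MPI sanity check (test sketch, not from the original
* example2.f): exercises MPI_COMM_SIZE directly, bypassing BLACS.
      program mpicheck
      implicit none
      include 'mpif.h'
      integer ierr,rank,nprocs

      call MPI_INIT(ierr)
      call MPI_COMM_SIZE(MPI_COMM_WORLD,nprocs,ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD,rank,ierr)
      print *,'rank ',rank,' of ',nprocs
      call MPI_FINALIZE(ierr)
      end
*****************************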

<snip>
