PGI User Forum


OpenMPI and MVAPICH2

 
PGI User Forum Forum Index -> Debugging and Profiling
bwb



Joined: 24 Feb 2012
Posts: 6

Posted: Fri Feb 24, 2012 6:49 am    Post subject: OpenMPI and MVAPICH2

Dear users and developers,

We are running a 16-blade cluster with four 12-core AMD Opteron CPUs per node and an InfiniBand interconnect, using the CDK 11.10-0 release. OpenMPI 1.4.4 and MVAPICH2 1.7 are installed. However, MPI debugging and profiling only work partially.

Debugging with MVAPICH2 works simply by invoking the debugger via:
pgdbg -mpi:mpiexec -np 4 ./a.out
With OpenMPI it does not, since its mpiexec seems to work differently. The PGI Tools manual suggests finding the environment variables that correspond to PGDBG_MPI_RANK_ENV and PGDBG_MPI_SIZE_ENV and setting them accordingly. For OpenMPI these are OMPI_COMM_WORLD_RANK and OMPI_COMM_WORLD_SIZE. In my opinion the actual problem is passing them to the PGDBG variables properly.
Are there any ideas whether this will make the debugger work, or how this can be achieved?
The situation for the profiler is similar:
Profiling with MVAPICH2 works for C code if it is compiled with e.g. -Mprof=mpich2,lines, but it does not work for Fortran.
Profiling with OpenMPI works for C code after editing the compiler wrapper data files of the OpenMPI installation as described in the PGI Tools manual, but doing the same for the Fortran wrapper data files enables profiling for process 0 only.
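For reference, the wrapper data files in question are plain-text key/value files under the OpenMPI install tree; the edit amounts to appending the PGI profiling option to the compiler_flags line. The file name, path, and exact flag below are assumptions for a typical OpenMPI 1.4.x install and should be checked against your installation and the PGI Tools manual:

    $MPI_ROOT/share/openmpi/mpif90-wrapper-data.txt:
        compiler=pgfortran
        compiler_flags=-Mprof=lines
        libs=-lmpi_f90 -lmpi_f77 -lmpi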
Are there any ideas, or did anyone encounter the same issue?

Thanks in advance

BWB
donb



Joined: 20 Jul 2004
Posts: 88
Location: The Portland Group, Inc.

Posted: Mon Feb 27, 2012 4:50 pm

Dear BWB:

Sorry you are having trouble with the tools.

First, the PGI tools aren't officially supported with MVAPICH2. Until such time as we support it, we can't provide much help there.

With respect to OpenMPI, that is officially supported. We are looking into the behaviour that you are reporting and will reply again once we have determined what is going on.

--Don
donb



Joined: 20 Jul 2004
Posts: 88
Location: The Portland Group, Inc.

Posted: Thu Mar 01, 2012 12:13 pm

Here is a workaround for the OpenMPI debugging issue:

To debug OpenMPI programs with PGDBG, the PGDBG-specific MPI environment
variables PGDBG_MPI_RANK_ENV and PGDBG_MPI_SIZE_ENV must be set to the
corresponding OpenMPI environment variables. You can then debug OpenMPI
programs using the following command:

pgdbg -mpi:<absolute_path_to_mpiexec> -x PGDBG_MPI_RANK_ENV=OMPI_COMM_WORLD_RANK -x
PGDBG_MPI_SIZE_ENV=OMPI_COMM_WORLD_SIZE <other_mpiexec_parameters>
<executable_path>
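For a four-process run of ./a.out, a concrete invocation might look like this (the mpiexec path is illustrative; substitute the absolute path of your OpenMPI 1.4.4 mpiexec):

    pgdbg -mpi:/opt/openmpi-1.4.4/bin/mpiexec -x PGDBG_MPI_RANK_ENV=OMPI_COMM_WORLD_RANK -x PGDBG_MPI_SIZE_ENV=OMPI_COMM_WORLD_SIZE -np 4 ./a.out

The -x options are OpenMPI's mechanism for exporting environment variables to the launched processes, which is how the PGDBG variables reach each rank.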

We are continuing to investigate the OpenMPI / Fortran profiling issue.
bwb



Joined: 24 Feb 2012
Posts: 6

Posted: Mon Mar 05, 2012 5:15 am

Forwarding the environment variables as described works perfectly.