[gmx-developers] MPI issues with Gromacs 4
rainy908 at yahoo.com
Mon Aug 3 06:12:12 CEST 2009
Hi,
I'm having MPI compatibility issues while running Gromacs v4.0.4. I've
tried using openmpi/bin/mpirun, openmpi-1.2.7-intel-mx2g/bin/mpirun, and
openmpi-intel-mx2g/bin/mpirun, but every one of them crashes the
simulation with the error:
"Program mdrun_mpi, VERSION 3.3.1
Source code file: init.c, line: 69
Fatal error:
run input file md3_2.tpr was made for 4 nodes,
while mdrun_mpi expected it to be for 1 nodes."
Does anyone have insight on this problem? Thanks.
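In case it helps diagnose this: the error reports VERSION 3.3.1 even
though I intend to run 4.0.4, so I suspect an older mdrun_mpi build may
be first on my PATH, and the "for 1 nodes" part suggests only one MPI
process is actually starting. Here is a minimal sketch of what I plan to
check (the process count and .tpr name are taken from the error above;
the rest is guesswork about my cluster's setup):

    # Confirm which mdrun_mpi the shell finds first; the error says
    # VERSION 3.3.1, not 4.0.4, so an old build may be shadowing it
    which mdrun_mpi

    # Launch with an MPI process count that matches the .tpr
    # (md3_2.tpr was made for 4 nodes)
    mpirun -np 4 mdrun_mpi -s md3_2.tpr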
2009/8/2 Mark Abraham <Mark.Abraham at anu.edu.au>
> rainy908 at yahoo.com wrote:
>
>> So I've been able to get access to Gromacs v4.0.4 on another
>> supercomputer cluster. However, I've been told that there are
>> compatibility issues between MPI and Gromacs v4. Also, I'm using the
>> MARTINI force field with Gromacs, but I'm not sure how well it has
>> been tested with v4.
>>
>
> Please start a new email with a new subject when you change topics. You
> should also try to search the archives for clues on these topics. I'm not
> aware of general problems with modern MPI libraries and GROMACS 4. The use
> GROMACS makes of MPI functionality is quite undemanding.
>
> Mark
>