[gmx-users] Well known domain decomposition
mark.j.abraham at gmail.com
Sat Apr 21 16:27:31 CEST 2018
You should consult your cluster's docs to see how to submit, say, a single
node job with 8 MPI ranks and two cores per rank. Having done so, e.g.
mpirun gmx_mpi mdrun
will just honour that. But maybe you want to be more explicit with
mpirun -np 8 gmx_mpi mdrun -ntomp 2
Then later you might consider using two nodes and 4 threads per rank.
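As a concrete starting point, a batch script along these lines would request that layout — this is only a sketch assuming your cluster uses Slurm (the directives and module names will differ on other schedulers; check your site's docs):

```shell
#!/bin/bash
# Hypothetical Slurm script for a single-node job with
# 8 MPI ranks and 2 OpenMP threads per rank (adjust for your scheduler).
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --cpus-per-task=2

# Match OpenMP threads to the cores the scheduler gave each rank.
export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK:-2}

mpirun -np 8 gmx_mpi mdrun -ntomp 2

# Two-node variant with 4 threads per rank would instead use:
#   --nodes=2 --ntasks-per-node=4 --cpus-per-task=4
#   mpirun -np 8 gmx_mpi mdrun -ntomp 4
```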
Note that there are several examples in the user guide to help you with this.
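If mdrun still aborts with the domain decomposition error at 16 ranks, two common ways around it are to dedicate some ranks to PME (so fewer PP domains are decomposed) or to use fewer, fatter ranks. The flags below are real mdrun options, but the rank counts are illustrative only — the right split depends on your system size:

```shell
# Option 1: of 16 ranks, dedicate 4 to PME, leaving 12 PP domains
mpirun -np 16 gmx_mpi mdrun -npme 4

# Option 2: halve the rank count and use 2 OpenMP threads per rank,
# so each of the 8 domains is larger
mpirun -np 8 gmx_mpi mdrun -ntomp 2
```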
On Sat, Apr 21, 2018, 15:07 Alex <alexanderwien2k at gmail.com> wrote:
> Dear all,
> The cluster where my GROMACS job is submitted has several nodes, each with
> 16 slots.
> My calculation works fine when it is submitted on a single node using 8
> slots out of 16, but it crashes when all 16 slots in a single node are used,
> due to the well-known domain decomposition issue.
> It also crashes with the same error when it is submitted over more than one
> node, using 32 slots.
> I blindly played around with the different available options, -ntomp/
> -npme/-ntomp_pme/ ..., and also export OMP_NUM_THREADS=8, but with no success. I
> also tried -nt 8 on a single node, but that does not work because GROMACS
> was compiled without thread-MPI on our machine.
> Actually, I cannot do much about the compilation or recompile it
> with the proper options, as it was compiled by our IT administration to be
> used with the VOTCA program (in my case).
> So, on such a cluster, could you please help me figure out the problem and
> run the simulation with more than 8 slots? At least with 16 slots
> on a single node, or even better with two or four nodes.
> Thank you.