[gmx-developers] gromacs.org_gmx-developers Digest, Vol 182, Issue 2
1004753465
1004753465 at qq.com
Wed Jun 5 13:05:02 CEST 2019
Hi,
Thank you very much! That really helps. I checked PAR(cr), and it turns out to be false; DOMAINDECOMP(cr) is false as well. It seems to me that, in the multisim case, dd_collect_state is no longer available.
Does this mean that there is no domain decomposition with multisim? If so, how do I access the local state, the local indices, mdatoms, and everything else related to dd_partition_system?
Does it mean that we always share the storage, without MPI communication? To be more specific, how do we get the information for a particular atom, e.g. its position, velocity, and so on?
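For example, I imagine something like the following sketch (assuming the state and mdatoms variables from md.cpp are in scope, and that global atom indices are valid because nothing was repartitioned); is this the right way to read atom i without DD?

    /* my assumption: with one rank per simulation there is no DD,
     * so the "local" state is the whole system */
    if (!DOMAINDECOMP(cr))
    {
        const rvec *x = as_rvec_array(state->x.data());  /* positions  */
        const rvec *v = as_rvec_array(state->v.data());  /* velocities */
        fprintf(stderr, "atom %d: x=(%g, %g, %g) v=(%g, %g, %g) mass=%g\n",
                i, x[i][XX], x[i][YY], x[i][ZZ],
                v[i][XX], v[i][YY], v[i][ZZ], mdatoms->massT[i]);
    }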
These questions are probably naive, but I find the parallelization scheme in the multisim case confusing, and any help will be appreciated.
Thank you so much!!!
Best regards,
Huan
------------------ Original ------------------
From: "gromacs.org_gmx-developers-request"<gromacs.org_gmx-developers-request at maillist.sys.kth.se>;
Date: Wed, Jun 5, 2019 06:00 PM
To: "gromacs.org_gmx-developers"<gromacs.org_gmx-developers at maillist.sys.kth.se>;
Subject: gromacs.org_gmx-developers Digest, Vol 182, Issue 2
Send gromacs.org_gmx-developers mailing list submissions to
gromacs.org_gmx-developers at maillist.sys.kth.se
To subscribe or unsubscribe via the World Wide Web, visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers
or, via email, send a message with subject or body 'help' to
gromacs.org_gmx-developers-request at maillist.sys.kth.se
You can reach the person managing the list at
gromacs.org_gmx-developers-owner at maillist.sys.kth.se
When replying, please edit your Subject line so it is more specific
than "Re: Contents of gromacs.org_gmx-developers digest..."
Today's Topics:
1. Re: mdrun_mpi not able to reach "rank" (Mark Abraham)
2. Upcoming 2019.3 patch release (Paul bauer)
----------------------------------------------------------------------
Message: 1
Date: Wed, 5 Jun 2019 08:31:25 +0200
From: Mark Abraham <mark.j.abraham at gmail.com>
To: Discussion list for GROMACS development
<gmx-developers at gromacs.org>
Cc: "gromacs.org_gmx-developers"
<gromacs.org_gmx-developers at maillist.sys.kth.se>
Subject: Re: [gmx-developers] mdrun_mpi not able to reach "rank"
Message-ID:
<CAMNuMARw-p5QYtWuK_cbLnznYMKrr_3Y4-XAn8ijK0yz9bE=nw at mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Hi,
In your two cases, the form of parallelism is different. In the latter, if
you are using two ranks with thread-MPI, then you cannot be using multisim,
so both ranks belong to the single simulation in use.
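In general, guard any access to cr->dd, because it is null whenever there is no domain decomposition. A minimal sketch (in 2018, DOMAINDECOMP(cr) is roughly a null check on cr->dd combined with a rank-count check):

    if (DOMAINDECOMP(cr))
    {
        /* safe: DD is active for this simulation */
        fprintf(stderr, "DD rank %d of %d\n", cr->dd->rank, cr->dd->nnodes);
    }
    else
    {
        /* cr->dd is null here, e.g. multisim with one rank per
         * simulation; dereferencing it gives the segfault quoted below */
    }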
The PAR(cr) macro (sadly misnamed, for historical reasons) reflects whether
there is more than one rank per simulation, so you should check that before
using e.g. the functions in gromacs/gmxlib/network.h to gather information
to the rank that is the master of each simulation. There are other
functions for communicating between the master ranks of multi-simulations
(e.g. see the REMD code).
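Schematically, the pattern might look like the sketch below. Check the exact signatures in your tree; collectToMasters is a hypothetical helper, and I am assuming the 2018 layout where the multisim record lives in cr->ms.

    #include "gromacs/gmxlib/network.h"   /* gmx_sumd, gmx_sumd_sim */
    #include "gromacs/mdtypes/commrec.h"  /* t_commrec, PAR, MASTER */

    /* hypothetical helper: reduce a value within each simulation, then
     * across the master ranks of all the simulations */
    static void collectToMasters(const t_commrec *cr, double *value)
    {
        if (PAR(cr))
        {
            gmx_sumd(1, value, cr);          /* intra-simulation sum */
        }
        if (MASTER(cr) && cr->ms != nullptr)
        {
            gmx_sumd_sim(1, value, cr->ms);  /* over simulation masters */
        }
    }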
Mark
On Wed, 5 Jun 2019 at 07:54, 1004753465 <1004753465 at qq.com> wrote:
> Hi everyone,
>
> I am currently trying to run two GROMACS 2018 processes in parallel using
>
> mpirun -np 2 ...(some path)/mdrun_mpi -v -multidir sim[01]
>
> During the simulation, I need to collect some information on the two
> master nodes, much as the function "dd_gather" does. Therefore, I need to
> access (cr->dd) on each rank. However, whenever I try to print
> "cr->dd->rank", "cr->dd->nnodes", or something like that, it just shows
>
> [c15:31936] *** Process received signal ***
> [c15:31936] Signal: Segmentation fault (11)
> [c15:31936] Signal code: Address not mapped (1)
> [c15:31936] Failing at address: 0x30
> [c15:31936] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x10340)
> [0x7f7f9e374340]
> [c15:31936] [ 1]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x468cfb]
> [c15:31936] [ 2]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x40dd65]
> [c15:31936] [ 3]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x42ca93]
> [c15:31936] [ 4]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x416f7d]
> [c15:31936] [ 5]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x41792c]
> [c15:31936] [ 6]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x438756]
> [c15:31936] [ 7]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x438b3e]
> [c15:31936] [ 8]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x439a97]
> [c15:31936] [ 9] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf5)
> [0x7f7f9d591ec5]
> [c15:31936] [10]
> /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x40b93e]
> [c15:31936] *** End of error message ***
> step 0: the second process (PID 31935) then fails with an identical
> backtrace, also at address 0x30.
> --------------------------------------------------------------------------
> mpirun noticed that process rank 0 with PID 31935 on node c15.dynstar
> exited on signal 11 (Segmentation fault).
> --------------------------------------------------------------------------
>
> However, if I build the package without the flag -DGMX_MPI=on, the single
> program (mdrun) runs smoothly, and all the domain decomposition ranks can
> be printed out and used conveniently.
>
> It is pretty weird to me that, with mdrun_mpi, although domain
> decomposition can be done, the ranks can neither be printed out nor
> accessed through the struct cr->dd. I wonder whether they are stored in
> some other form, but I do not know what that is.
>
> I would appreciate it if someone could help. Thank you very much!!!
> Best,
> Huan
------------------------------
Message: 2
Date: Wed, 5 Jun 2019 10:13:39 +0200
From: Paul bauer <paul.bauer.q at gmail.com>
To: gromacs.org_gmx-developers at maillist.sys.kth.se
Subject: [gmx-developers] Upcoming 2019.3 patch release
Message-ID: <3690c17f-283a-3c34-ab48-1e11b0ff0417 at gmail.com>
Content-Type: text/plain; charset=utf-8; format=flowed
Hello developers,
The next GROMACS patch release for the 2019 branch is planned for the end
of next week, around June 14.
The plan continues to be to publish these patch releases every 2-3 months,
so that users get their hands on the latest fixes quickly.
Please check
https://redmine.gromacs.org/projects/gromacs/issues?fixed_version_id=89&set_filter=1&status_id=o
to see whether there is an issue you can help fix, or help review open changes at
https://gerrit.gromacs.org/#/q/status:open+project:gromacs+branch:release-2019.
Cheers
Paul
--
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594
------------------------------
End of gromacs.org_gmx-developers Digest, Vol 182, Issue 2
**********************************************************