[gmx-users] optimum number of OpenMP threads for simulation
Irem Altan
irem.altan at duke.edu
Tue Dec 20 21:09:00 CET 2016
Hi,
Is there an example usage of -multi or -multidir somewhere?
Best,
Irem
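
A minimal -multidir invocation, sketched under the assumption that each umbrella window lives in its own directory containing that window's topol.tpr; the directory names, rank count, and MPI launcher here are illustrative, not taken from the thread:

```shell
# One directory per umbrella window; each must already contain its own
# topol.tpr (e.g. produced by gmx grompp for that window).
mkdir -p umbrella0 umbrella1

# Write the launch command to a job script rather than running it here.
# The MPI ranks are split evenly across the -multidir directories (8 ranks
# -> 4 per window), and each simulation reads and writes its files
# (pull output, trajectory, log, ...) inside its own directory.
cat > run_windows.sh <<'EOF'
#!/bin/bash
mpirun -np 8 gmx_mpi mdrun -multidir umbrella0 umbrella1
EOF
chmod +x run_windows.sh
```

Because every simulation works in its own directory, there is no need to pass per-simulation -pf/-px filenames: the default pull output names do not collide.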
> On Dec 19, 2016, at 6:20 PM, Mark Abraham <mark.j.abraham at gmail.com> wrote:
>
> Hi,
>
> I haven't tried -multi with the GROMACS features that do their own file
> handling (which makes -multi awkward to implement), but my recommendation
> of -multidir covers that case as well.
>
> Mark
>
> On Tue, 20 Dec 2016 09:59 Irem Altan <irem.altan at duke.edu> wrote:
>
>> Hi,
>>
>>> Yes, mdrun -multi is pretty tolerant (though I recommend mdrun -multidir
>>> because it's easier to manage each simulation in a separate directory).
>>
>> In the case where they are all in the same directory, would the following
>> be the correct command?
>>
>> ibrun gmx_mpi mdrun -v -multi -deffnm umbrella0 umbrella1 -pf
>> pullf-umbrella0.xvg pullf-umbrella1.xvg -px pullx-umbrella0.xvg
>> pullx-umbrella1.xvg
>>
>>
>>
>> The simulation that I mentioned is just one window of the umbrella sampling
>> I’m trying to do. Can I bundle two windows together? Otherwise, I can just
>> use the whole node and run a single simulation on 4 GPUs. I’m trying to use
>> half the node because running two simulations on half the node would be
>> faster than running one simulation on the entire node, as the simulation
>> speed doesn’t double when the resources are doubled.
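
The half-node bundling described above might be sketched as follows; the rank, thread, and GPU counts are assumptions for a single node with 4 GPUs, not values taken from the thread:

```shell
# Sketch: bundle two umbrella windows into one job on a node with 4 GPUs.
# 4 MPI ranks -> 2 per simulation; -gpu_id 0123 maps one GPU to each rank,
# so each simulation gets 2 GPUs; -ntomp sets OpenMP threads per rank
# (choose it so ranks x threads fills the cores you were allocated).
cat > run_half_node.sh <<'EOF'
#!/bin/bash
mpirun -np 4 gmx_mpi mdrun -multidir umbrella0 umbrella1 \
       -ntomp 6 -pin on -gpu_id 0123
EOF
chmod +x run_half_node.sh
```

On a TACC-style system the `mpirun -np 4` would be replaced by the site launcher (e.g. `ibrun`, as used earlier in the thread).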
>>
>>
>>> Overall throughput will be highest when you use the least resources per
>>> simulation - see https://arxiv.org/abs/1507.00898
>>
>>
>> Thank you for the information.
>>
>> Best,
>> Irem
>> --
>> Gromacs Users mailing list
>>
>> * Please search the archive at
>> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
>> posting!
>>
>> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>>
>> * For (un)subscribe requests visit
>> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
>> send a mail to gmx-users-request at gromacs.org.
More information about the gromacs.org_gmx-users mailing list