[gmx-users] Low cpu utilization

Mahmood Naderan mahmood.nt at gmail.com
Mon Oct 17 09:56:07 CEST 2016


Here is what I did...
I changed the cutoff-scheme to Verlet as suggested by
http://www.gromacs.org/Documentation/Cut-off_schemes#How_to_use_the_Verlet_scheme
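
For reference, that change is a single line in the .mdp file, followed by
regenerating the run input so mdrun picks it up. A minimal sketch, assuming
input files named md.mdp, conf.gro and topol.top (placeholder names, not my
actual files) and a 'gmx grompp' style invocation (the exact grompp binary
name on this install is a guess):

# check that the .mdp now selects the Verlet scheme
$ grep -i cutoff-scheme md.mdp
cutoff-scheme = Verlet
# regenerate the .tpr so the new scheme takes effect
$ gmx grompp -f md.mdp -c conf.gro -p topol.top -o topol.tpr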


Then I tried two scenarios:

1) On the frontend, where GROMACS and Open MPI are installed, I ran:

mahmood at cluster:LPN$ date
Mon Oct 17 11:06:40 2016
mahmood at cluster:LPN$ /share/apps/computer/openmpi-2.0.1/bin/mpirun -np 2
/share/apps/chemistry/gromacs-5.1/bin/mdrun_mpi -v
...
...
starting mdrun 'Protein in water'
50000000 steps,  50000.0 ps.
step 0
[cluster.scu.ac.ir:28044] 1 more process has sent help message
help-mpi-btl-base.txt / btl:no-nics
[cluster.scu.ac.ir:28044] Set MCA parameter "orte_base_help_aggregate" to 0
to see all help / error messages
imb F  0% step 100, will finish Tue Dec  6 11:41:44 2016
imb F  0% step 200, will finish Sun Dec  4 23:06:02 2016
^Cmahmood at cluster:LPN$ date
Mon Oct 17 11:07:01 2016


So, roughly 21 seconds for about 200 steps. Checking with the 'top' command,
two CPU cores were at 100%. The full log is available at http://pastebin.com/CzViEmRb
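
As a rough sanity check on that rate (the 21 seconds include mdrun startup,
so the real per-step cost is a bit lower): 21 s / 200 steps is about
0.105 s/step, and 50,000,000 steps at 0.105 s/step is about 5.3e6 s, i.e.
roughly 61 days, which is in the same ballpark as the early-December finish
estimate mdrun prints above.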



2) I specified two compute nodes instead of the frontend. Each node has at
least one free core, so running one process on each should be comparable to
the previous scenario.

mahmood at cluster:LPN$ cat hosts.txt
compute-0-2
compute-0-1
mahmood at cluster:LPN$ date
Mon Oct 17 11:12:34 2016
mahmood at cluster:LPN$ /share/apps/computer/openmpi-2.0.1/bin/mpirun -np 2
--hostfile hosts.txt /share/apps/chemistry/gromacs-5.1/bin/mdrun_mpi -v
...
...
starting mdrun 'Protein in water'
50000000 steps,  50000.0 ps.
step 0
^CKilled by signal 2.
Killed by signal 2.
mahmood at cluster:LPN$ date
Mon Oct 17 11:15:47 2016

So, roughly 3 minutes without any progress! After ssh'ing to compute-0-2, the
'top' command shows:

23153 mahmood   39  19  190m  15m 6080 R  1.3  0.0   0:00.39 mdrun_mpi
23154 mahmood   39  19  190m  16m 5700 R  1.3  0.0   0:00.39 mdrun_mpi

That is very low CPU utilization (about 1.3% per process). Please see the log at
http://pastebin.com/MZbjK4vD
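
In case it helps with the diagnosis: the btl:no-nics help message from
scenario 1 suggests Open MPI is not finding the network interface it expects,
so one thing I could try is forcing the TCP BTL over a specific interface. A
minimal sketch, assuming the compute nodes reach each other over eth0 (the
interface name is a guess I have not verified):

$ /share/apps/computer/openmpi-2.0.1/bin/mpirun -np 2 --hostfile hosts.txt \
    --mca btl tcp,self --mca btl_tcp_if_include eth0 \
    /share/apps/chemistry/gromacs-5.1/bin/mdrun_mpi -v

If mdrun then makes progress at full CPU, the problem is in the inter-node
transport rather than in GROMACS itself.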



Any ideas are welcome.




Regards,
Mahmood

