[gmx-users] problem with MPI, threads and number of CPUs

khourshaeishargh at mech.sharif.ir
Mon May 16 21:12:42 CEST 2016





Dear GROMACS users,


During a simulation of a DPPC membrane I ran into several problems, which I would like to describe here; I would really appreciate your help. Briefly, I started by simulating a box containing 2000 W and 128 DPPC molecules. The trend (i.e., the way it increases or decreases) of the xx component of the pressure is important to me. As one would expect, given the size of my system (number of atoms), this quantity fluctuates strongly (by about 200 bar). So I enlarged the system in the x-direction, expecting both systems to show a consistent trend: I replicated the box in the x-direction, minimized it, and ran an NPT equilibration on it.
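For reference, the replication step looked roughly like this (the structure file names are just placeholders for my setup):

    gmx genconf -f dppc_128.gro -o dppc_256.gro -nbox 2 1 1

followed by multiplying the molecule counts in the topology accordingly before the minimization.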


During the NPT run, my job blew up with this error:


	An atom moved too far between two domain decomposition steps


Does this error relate to the number of CPU cores? When I decrease the number of cores from 16 to 8 using -nb 8, I don't see the error again, but then there is no progress in the number of steps!
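For completeness, the thread-related options I know of are shown below (the -deffnm name is a placeholder for my actual run files; whether -nb was the right flag for limiting the cores is part of my question):

    # let mdrun choose the thread layout automatically
    gmx mdrun -deffnm npt

    # limit the total number of threads to 8
    gmx mdrun -deffnm npt -nt 8

    # or set thread-MPI ranks and OpenMP threads per rank explicitly
    gmx mdrun -deffnm npt -ntmpi 4 -ntomp 2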


I also have a question about MPI. On my laptop, when I simulate a box (128 DPPC with 2000 W), the run reports:


	Using 1 MPI thread

	Using 4 OpenMP threads


When I do the same on the university's high-performance computer, its output is completely different; it automatically uses:


	Using 24 MPI threads

	Using 1 OpenMP thread per tMPI thread


But when I used -ntmpi 1 to change the number of MPI threads to 1, it gave the same output as my laptop. So what are the optimal values for -ntmpi and -nt? I should note that I know what MPI and OpenMP threads mean, but I don't know what the optimal numbers are.
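For reference, the two cluster runs were roughly as follows (again, -deffnm npt is a placeholder for my actual run files):

    # default on the cluster: 24 thread-MPI ranks, 1 OpenMP thread each
    gmx mdrun -deffnm npt

    # forcing a single thread-MPI rank, which reproduced the laptop output
    gmx mdrun -deffnm npt -ntmpi 1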


Best regards,


	Ali


	==================


	Ali khourshaei shargh (khourshaeishargh at mech.sharif.ir)


	Department of Mechanical Engineering


	Sharif University of Technology, Tehran, Iran

