[gmx-users] Parallelization limit?

Andrea Carotti and.carotti at farmchim.uniba.it
Wed Apr 19 14:41:51 CEST 2006


Hi Carsten,
thanks for your quick reply.
Could you please confirm that GROMACS 3.3.1 compiles and runs fine with 
MPICH 2.x? This is the first time I've heard that from a user, and if so 
I'll go that way instead of using LAM.
Thanks again
Andrea
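
P.S. For reference, the kind of build I have in mind is sketched below. It 
follows the usual GROMACS 3.x recipe; the paths and install prefixes are 
only illustrative:

    # Single-precision FFTW 2.1.5; --enable-type-prefix builds the
    # sfftw/srfftw libraries that a single-precision GROMACS 3.x expects
    cd fftw-2.1.5
    ./configure --enable-float --enable-type-prefix --prefix=$HOME/fftw215
    make && make install

    # GROMACS 3.3.1 with MPI, pointed at that FFTW; only mdrun needs MPI,
    # so the usual recipe builds and installs just that binary, with a
    # _mpi suffix to keep it apart from the serial tools
    cd ../gromacs-3.3.1
    export CPPFLAGS=-I$HOME/fftw215/include
    export LDFLAGS=-L$HOME/fftw215/lib
    ./configure --enable-mpi --program-suffix=_mpi --prefix=$HOME/gmx331
    make mdrun && make install-mdrun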
----- Original Message ----- 
From: "Carsten Kutzner" <ckutzne at gwdg.de>
To: "Discussion list for GROMACS users" <gmx-users at gromacs.org>
Sent: Wednesday, April 19, 2006 2:15 PM
Subject: Re: [gmx-users] Parallelization limit?


> Hi Andrea,
>
> Andrea Carotti wrote:
>> Hi all,
>> I'm trying to simulate a system with two identical proteins (42 aa 
>> each), SPC solvent (18430 molecules) and 6 Na+ ions, for a total of 
>> ~56100 atoms.
>> Now the "problem" is that if I run the MD on 4 nodes everything works 
>> fine, but when I try to use 6 or 8 CPUs the process stops on the 
>> master and continues only on the last two slaves (4 CPUs; the systems 
>> are dual Xeon). The problem is that I can't find any error message in 
>> the log files.
>> I've tried the d.dppc benchmark files, and everything runs fine with 
>> 2, 4, 6 and 8 CPUs.
>> So my question is: is there a parallelization limit in GROMACS that 
>> depends on the simulated system?
>> I'm using GROMACS 3.3.1, MPICH 1.2.5.2 and FFTW 2.1.5.
>
> I have encountered similar problems when using mpich 1.2.x on
> Ethernet. Upgrading to mpich-2.x or using LAM solved the problems.
> There is only a parallelisation limit in the sense that using more
> CPUs does not always result in faster execution. 56 k atoms should
> run happily on 8 CPUs.
>
> Good luck,
>   Carsten
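
Regarding the hangs Carsten describes: a quick way to separate an MPI 
problem from a GROMACS problem is to smoke-test the MPI installation 
itself with more than four processes. The commands below use MPICH2's 
mpd-based tools; hostnames and file names are illustrative:

    # start MPICH2's process-manager ring, one mpd per dual-Xeon box
    # (assumes an mpd.hosts file listing the four nodes)
    mpdboot -n 4
    # every rank should report its node name; if this already hangs at
    # -n 6 or -n 8, the MPI layer, not mdrun, is the culprit
    mpiexec -n 8 hostname

    # GROMACS 3.x pre-partitions the system at grompp time, so grompp
    # -np must match the rank count later given to mpiexec
    grompp -np 8 -f md.mdp -c conf.gro -p topol.top -o topol.tpr
    mpiexec -n 8 mdrun_mpi -s topol.tpr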