[gmx-users] Error: No parallel Ewald. Use PME instead!

Ali Alizadeh ali.alizadehmojarad at gmail.com
Wed Jun 4 14:29:56 CEST 2014


Dear Mark,

In my first simulation I had these lines, and in the second I changed the electrostatics settings as shown below:

; Electrostatics
coulombtype    = PME        ; Particle Mesh Ewald for long-range electrostatics
pme_order    = 4        ; cubic interpolation
fourierspacing    = 0.12        ; grid spacing for FFT

-----------------------

; Electrostatics
coulombtype    = ewald        ; classical Ewald summation for long-range electrostatics
fourierspacing    = 0.6        ; grid spacing for FFT

-----------------

With the second set of settings, my system equilibrates.
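
For comparison, a self-consistent PME electrostatics block might look like the lines below. This is only a sketch: the nstlist, rlist, rcoulomb and ewald_rtol values are illustrative assumptions for a typical setup, not taken from my actual mdp files.

; Electrostatics (illustrative sketch)
nstlist        = 10        ; neighbour-list update frequency (assumed)
rlist          = 1.0       ; neighbour-list cutoff, nm (assumed)
coulombtype    = PME       ; Particle Mesh Ewald for long-range electrostatics
rcoulomb       = 1.0       ; real-space Coulomb cutoff, nm; should match rlist (assumed)
pme_order      = 4         ; cubic interpolation
fourierspacing = 0.12      ; grid spacing for FFT
ewald_rtol     = 1e-5      ; relative strength of the shifted potential at rcoulomb (GROMACS default)

The reason for listing these together is that rcoulomb, fourierspacing and ewald_rtol jointly set the accuracy of the long-range electrostatics, so changing fourierspacing from 0.12 to 0.6 in the second run alters the accuracy of the calculation, not only the choice of algorithm.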

On Wed, Jun 4, 2014 at 4:35 PM, Mark Abraham <mark.j.abraham at gmail.com>
wrote:

>
>
>
> On Wed, Jun 4, 2014 at 12:42 PM, Ali Alizadeh <
> ali.alizadehmojarad at gmail.com> wrote:
>
>> Dear Mark,
>>
>> Thank you for your reply.
>>
>> When you do not change any parameters in the mdp file except one thing (PME to
>> Ewald, as the paper suggested) and your system then converges, to me that means
>> the problem is related to choosing the right method for calculating the
>> electrostatic potential energy. What is your take on this?
>>
>>
> My point is that a flawed model physics, or a flawed system preparation,
> can get lucky and run stably, or not. On the information given, there is no
> reason to suppose that changing the kind of reciprocal-space approximation
> is relevant. Last time I heard a report like this, the person had changed
> various nonbonded settings "to make things run faster," and in so doing
> produced a junk model physics.
>
> Mark
>
>
>> -----
>>
>> Mark wrote:
>>
>> Hi,
>>
>> What makes you think the PME algorithm, rather than your choices of
>> settings for it (and twenty other things), is the problem?
>>
>> Mark
>>
>>
>> On Wed, Jun 4, 2014 at 7:57 AM, Ali Alizadeh <ali.alizadehmojarad at gmail.com>
>> wrote:
>>
>> > Dear All users,
>> >
>> > I have encountered this error:
>> >
>> > -------------
>> > Fatal error:
>> > No parallel Ewald. Use PME instead.
>> > --------------
>> >
>> > I used 8 cores and GROMACS 4.5.5 for my calculations. I should use the
>> > Ewald method because my system cannot converge using PME; it heats up
>> > my system. Are there any suggestions?
>> >
>> > --
>> > Sincerely
>> >
>> > Ali Alizadeh
>> > --
>>
>>
>> --
>> Sincerely
>>
>> Ali Alizadeh
>>
>
>
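
A side note on the original fatal error: as the message says, GROMACS 4.5 has no parallel implementation of the plain Ewald sum, so coulombtype = ewald only runs on a single process. A single-threaded run along the following lines should avoid the error (a sketch only; the file names are placeholders, and I assume the .tpr is built with coulombtype = ewald):

grompp -f ewald.mdp -c conf.gro -p topol.top -o ewald_run.tpr
mdrun -nt 1 -deffnm ewald_run    # one thread, since plain Ewald cannot be decomposed

With plain Ewald the run cannot use the 8 cores, which is another reason to sort out consistent PME settings instead.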


-- 
Sincerely

Ali Alizadeh

