[gmx-users] REMD mdrun_mpi error

Mark Abraham mark.j.abraham at gmail.com
Tue Jun 23 12:47:19 CEST 2015


Hi,

Do your individual replica .tpr files run correctly on their own?
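
For example, a quick stand-alone check of one replica might look something like
the sketch below (the eq_0.tpr name is inferred from the -multi 48 naming in
your submission script; substitute your actual per-replica file):

    # run replica 0 by itself as a short sanity check
    mpirun -np 1 mdrun_mpi -s eq_0.tpr -deffnm test_replica0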

Mark

On Mon, Jun 22, 2015 at 3:35 PM Nawel Mele <nawel.mele at gmail.com> wrote:

> Dear gromacs users,
>
> I am trying to simulate a ligand using the REMD method in explicit solvent
> with the CHARMM force field. When I try to equilibrate my system I get this
> error:
>
> Double sids (0, 1) for atom 26
> Double sids (0, 1) for atom 27
> Double sids (0, 1) for atom 28
> Double sids (0, 1) for atom 29
> Double sids (0, 1) for atom 30
> Double sids (0, 1) for atom 31
> Double sids (0, 1) for atom 32
> Double sids (0, 1) for atom 33
> Double sids (0, 1) for atom 34
> Double sids (0, 1) for atom 35
> Double sids (0, 1) for atom 36
> Double sids (0, 1) for atom 37
> Double sids (0, 1) for atom 38
> Double sids (0, 1) for atom 39
> Double sids (0, 1) for atom 40
>
> -------------------------------------------------------
> Program mdrun_mpi, VERSION 4.6.5
> Source code file:
> /local/software/gromacs/4.6.5/source/gromacs-4.6.5/src/gmxlib/invblock.c,
> line: 99
>
> Fatal error:
> Double entries in block structure. Item 53 is in blocks 1 and 0
>  Cannot make an unambiguous inverse block.
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
>
>
> My mdp input file looks like this:
>
> title           = CHARMM compound NVT equilibration
> define          = -DPOSRES      ; position restrain the protein
> ; Run parameters
> integrator      = sd            ; leap-frog stochastic dynamics integrator
> nsteps          = 1000000       ; 2 * 1000000 = 100 ps
> dt              = 0.002         ; 2 fs
> ; Output control
> nstxout         = 500           ; save coordinates every 0.2 ps
> nstvout         = 100000        ; save velocities every 0.2 ps
> nstenergy       = 500           ; save energies every 0.2 ps
> nstlog          = 500           ; update log file every 0.2 ps
> ; Bond parameters
> continuation    = no            ; first dynamics run
> constraint_algorithm = SHAKE    ; holonomic constraints
> constraints     = h-bonds       ; all bonds (even heavy atom-H bonds) constrained
> shake-tol       = 0.00001       ; relative tolerance for SHAKE
> ; Neighborsearching
> ns_type         = grid          ; search neighboring grid cells
> nstlist         = 5             ; 10 fs
> rlist           = 1.0           ; short-range neighborlist cutoff (in nm)
> rcoulomb        = 1.0           ; short-range electrostatic cutoff (in nm)
> rvdw            = 1.0           ; short-range van der Waals cutoff (in nm)
> ; Electrostatics
> coulombtype     = PME           ; Particle Mesh Ewald for long-range electrostatics
> pme_order       = 4             ; Interpolation order for PME. 4 equals cubic interpolation
> fourierspacing  = 0.16          ; grid spacing for FFT
> ; Temperature coupling is on
> ;tcoupl         = V-rescale     ; modified Berendsen thermostat
> tc-grps         = LIG SOL       ; two coupling groups - more accurate
> tau_t           = 1.0   1.0     ; time constant, in ps
> ref_t           = XXXXX  XXXXX  ; reference temperature, one for each group, in K
> ; Langevin dynamics
> bd-fric         = 0             ; Brownian dynamics friction coefficient
> ld-seed         = -1            ; pseudo random seed is used
> ; Pressure coupling is off
> pcoupl          = no            ; no pressure coupling in NVT
> ; Periodic boundary conditions
> pbc             = xyz           ; 3-D PBC
> ; Dispersion correction
> DispCorr        = EnerPres      ; account for cut-off vdW scheme
> ; Velocity generation
> gen_vel         = yes           ; assign velocities from Maxwell distribution
> gen_temp        = 0.0           ; temperature for Maxwell distribution
> gen_seed        = -1            ; generate a random seed
>
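> For reference, per-replica .tpr files for -multi are typically built from
> copies of this .mdp (one per temperature, with the XXXXX values filled in),
> roughly along these lines; topol.top, conf.gro and the eq_X.mdp names are
> illustrative assumptions, not taken from this post:
>
>     # one grompp call per replica; each eq_${i}.mdp differs only in ref_t
>     for i in $(seq 0 47); do
>         grompp -f eq_${i}.mdp -c conf.gro -p topol.top -o eq_${i}.tpr
>     done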
>
> And my submission script to run it in parallel looks like this:
>
> #!/bin/bash
> #PBS -l nodes=3:ppn=16
> #PBS -l walltime=00:10:00
> #PBS -o zzz.qsub.out
> #PBS -e zzz.qsub.err
> module load openmpi
> module load gromacs/4.6.5
> mpirun -np 48 mdrun_mpi -s eq_.tpr -multi 48 -replex 100000 >& faillog-X.log
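>
> With -multi 48 and -s eq_.tpr, mdrun_mpi should look for eq_0.tpr through
> eq_47.tpr; a quick sanity check that all of them are present might be:
>
>     # verify that every replica input expected by -multi 48 exists
>     for i in $(seq 0 47); do
>         test -f eq_${i}.tpr || echo "missing eq_${i}.tpr"
>     done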
>
>
> Has anyone seen this issue before?
>
> Many thanks,
> --
>
> Nawel Mele, PhD Research Student
>
> Jonathan Essex Group, School of Chemistry
>
> University of Southampton,  Highfield
>
> Southampton, SO17 1BJ

