[gmx-users] do_dssp strange reaction
Chalaoux, Francois-Regis
Francois-Regis.Chalaoux at evotec.com
Wed May 4 15:35:56 CEST 2016
Hi everybody,
When trying to run do_dssp_mpi on my MD trajectory, I get the error described below.
I passed "-ver 2" to match DSSP 2.0.4, but nothing changed.
I have been trying to debug this message for a week now and it is becoming a real pain in the neck.
Any leads, guys?
FRC
Config:
Linux CentOS 6.7
GROMACS 5.0.4 (MPI build)
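For context, as far as I understand gmx do_dssp locates the dssp executable through the DSSP environment variable (falling back to /usr/local/bin/dssp when it is unset). Below is a minimal sketch of the setup I believe is expected; the exact binary name under our 2.0.4/bin directory (dssp vs. mkdssp) is an assumption on my part, not something I have verified:

# assumed binary name inside the install's bin directory
export DSSP=/site/tl/app/x86_64/discovery/dssp/2.0.4/bin/dssp
do_dssp_mpi -f md_0_1.xtc -s md_0_1.tpr -sc scount.xvg -o ss.xpm -dt 10 -ver 2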
==============
CMD and Error
==============
[fchalaoux at frtow0216 Complex3]$ do_dssp_mpi -f md_0_1.xtc -s md_0_1.tpr -sc scount.xvg -o ss.xpm -dt 10 -ver 2
GROMACS: gmx do_dssp, VERSION 5.0.4
GROMACS is written by:
Emile Apol, Rossen Apostolov, Herman J.C. Berendsen, Pär Bjelkmar,
Aldert van Buuren, Rudi van Drunen, Anton Feenstra, Sebastian Fritsch,
Gerrit Groenhof, Christoph Junghans, Peter Kasson, Carsten Kutzner,
Per Larsson, Justin A. Lemkul, Magnus Lundborg, Pieter Meulenhoff,
Erik Marklund, Teemu Murtola, Szilárd Páll, Sander Pronk,
Roland Schulz, Alexey Shvetsov, Michael Shirts, Alfons Sijbers,
Peter Tieleman, Christian Wennberg, Maarten Wolf
and the project leaders:
Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel
Copyright (c) 1991-2000, University of Groningen, The Netherlands.
Copyright (c) 2001-2014, The GROMACS development team at
Uppsala University, Stockholm University and
the Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.
GROMACS is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
of the License, or (at your option) any later version.
GROMACS: gmx do_dssp, VERSION 5.0.4
Executable: /site/tl/app/x86_64/discovery/academic/gromacs/5.0.4/icc-2015_mkl_openmpi-1.8.5_plumed-2.1.3-sp/bin/gmx_mpi
Library dir: /site/tl/app/x86_64/discovery/academic/gromacs/5.0.4/icc-2015_mkl_openmpi-1.8.5_plumed-2.1.3-sp/share/gromacs/top
Command line:
do_dssp_mpi -f md_0_1.xtc -s md_0_1.tpr -sc scount.xvg -o ss.xpm -dt 10 -ver 2
Reading file md_0_1.tpr, VERSION 5.0.4 (single precision)
Reading file md_0_1.tpr, VERSION 5.0.4 (single precision)
Group 0 ( System) has 47921 elements
Group 1 ( Protein) has 2614 elements
Group 2 ( Protein-H) has 1301 elements
Group 3 ( C-alpha) has 163 elements
Group 4 ( Backbone) has 489 elements
Group 5 ( MainChain) has 653 elements
Group 6 ( MainChain+Cb) has 805 elements
Group 7 ( MainChain+H) has 815 elements
Group 8 ( SideChain) has 1799 elements
Group 9 ( SideChain-H) has 648 elements
Group 10 ( Prot-Masses) has 2614 elements
Group 11 ( non-Protein) has 45307 elements
Group 12 ( Other) has 22 elements
Group 13 ( JZ4) has 22 elements
Group 14 ( CL) has 6 elements
Group 15 ( Water) has 45279 elements
Group 16 ( SOL) has 45279 elements
Group 17 ( non-Water) has 2642 elements
Group 18 ( Ion) has 6 elements
Group 19 ( JZ4) has 22 elements
Group 20 ( CL) has 6 elements
Group 21 ( Water_and_ions) has 45285 elements
Select a group: 5
Selected 5: 'MainChain'
There are 163 residues in your selected group
dssp cmd='/site/tl/app/x86_64/discovery/dssp/2.0.4/bin -i ddEJhJwS -o ddllnW5z > /dev/null 2> /dev/null'
Reading frame 0 time 0.000
Back Off! I just backed up ddEJhJwS to ./#ddEJhJwS.1#
-------------------------------------------------------
Program do_dssp_mpi, VERSION 5.0.4
Source code file: /site/tl/app/x86_64/discovery/academic/gromacs/5.0.4-src/gromacs-5.0.4/src/gromacs/gmxana/gmx_do_dssp.c, line: 670
Fatal error:
Failed to execute command: Try specifying your dssp version with the -ver option.
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------
Halting program do_dssp_mpi
gcq#48: "You Could Make More Money As a Butcher" (F. Zappa)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode -1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
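For what it is worth, the exact command GROMACS tries to run is the one shown on the 'dssp cmd=' line above. A sketch of how that command could be tested by hand, outside of GROMACS, to see whether it executes at all; frame.pdb is only a placeholder for a single frame written out from the trajectory (e.g. with trjconv), not a file from my run:

# run the same command GROMACS builds, on a hand-made input frame
/site/tl/app/x86_64/discovery/dssp/2.0.4/bin -i frame.pdb -o frame.dssp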