[gmx-users] Excessive compute time using PME
Jordi Camps
jcamps at lsi.upc.edu
Mon May 23 17:33:02 CEST 2005
Hello,
I started with an EM run and a slightly bigger box (5% bigger), but a problem remains:
Steepest Descents:
Tolerance (Fmax) = 1.00000e+03
Number of steps = 500
Fatal error: ci = -2147483648 should be in 0 .. 2001 [FILE nsgrid.c, LINE 218]
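If I read the error right, ci is the index of the neighbour-search grid cell an atom gets assigned to, and -2147483648 is INT_MIN, which is what a NaN coordinate typically becomes when C truncates it to a 32-bit integer on x86. So some coordinates have probably blown up before the grid assignment. An illustrative NumPy sketch (not the actual nsgrid.c code; the cell count of 12 is made up):

    import numpy as np

    # The cell index is roughly int(x / box_x * n_cells). A coordinate
    # that has become NaN turns into INT_MIN when cast to a 32-bit int
    # (on typical x86 builds) -- exactly the ci in the error above.
    x, box_x, n_cells = np.nan, 6.7273, 12
    ci = np.array([x / box_x * n_cells]).astype(np.int32)[0]
    print(ci)  # -2147483648 (NumPy also emits an invalid-value warning)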
Where is the center of the box supposed to be? In the file, only the box
vectors are given, with no reference point.
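As far as I understand, GROMACS keeps the corner of the box at the origin
(0, 0, 0), so the center would simply be half the sum of the three box
vectors. A quick NumPy sketch, using the vectors quoted further down in
this thread:

    import numpy as np

    # Box vectors from the box line quoted below; GROMACS anchors the
    # box at (0, 0, 0), so no separate reference point is stored.
    v1 = np.array([6.72730, 0.00000, 0.00000])
    v2 = np.array([2.24243, 6.34256, 0.00000])
    v3 = np.array([-2.24243, 3.17128, 5.49282])

    # Center of a parallelepiped anchored at the origin:
    center = 0.5 * (v1 + v2 + v3)
    print(center)  # [3.36365 4.75692 2.74641] (nm)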
Thanks!
--
Jordi Camps Puchades
Instituto Nacional de Bioinformatica (INB) Nodo Computacional GNHC-2
UPC-CIRI
c/. Jordi Girona 1-3
Modul C6-E201 Tel. : 934 011 650
E-08034 Barcelona Fax : 934 017 014
Catalunya (Spain) e-mail: jcamps at lsi.upc.edu
> -----Original Message-----
> From: gmx-users-bounces at gromacs.org
> [mailto:gmx-users-bounces at gromacs.org] On behalf of David van der Spoel
> Sent: Thursday, 19 May 2005 14:31
> To: Discussion list for GROMACS users
> Subject: RE: [gmx-users] Excessive compute time using PME
>
> On Thu, 2005-05-19 at 13:13 +0200, Jordi Camps wrote:
> > Hello,
> >
> > I have some news about the problem. Yesterday I saw that the input
> > I was feeding was not completely contained in the octahedron, so I
> > recentered the travelling waters, and now I have a true octahedron
> > as input for mdrun. If I check the coordinates in the tpr file
> > (extracted with editconf), I still see an octahedron in VMD.
> > But when I run the simulation with mdrun, it fails. Before core
> > dumping, mdrun writes two files: step-1.pdb and step0.pdb.
> > Step0.pdb shows an exploded molecule (and waters). Step-1.pdb shows
> > my input in cubic boundaries. I think this is the problem: I'm
> > feeding an octahedron and mdrun is interpreting it as a rectangular
> > box. Are there any special considerations I should take into
> > account if I want to simulate with an octahedron? Perhaps I'm
> > missing some parameter in the grompp.mdp file? The box vectors, as
> > given in the file, are 6.72730 6.34256 5.49282 0.00000 0.00000
> > 2.24243 0.00000 -2.24243 3.17128.
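> >
> > As a sanity check (a NumPy sketch, assuming the standard .gro box
> > line order v1(x) v2(y) v3(z) v1(y) v1(z) v2(x) v2(z) v3(x) v3(y)),
> > those nine numbers do match the triclinic form of a truncated
> > octahedron:
> >
> >     import numpy as np
> >
> >     d = 6.72730  # length of the first box vector
> >     v1 = np.array([d, 0.0, 0.0])
> >     v2 = np.array([d / 3, 2 * np.sqrt(2) / 3 * d, 0.0])
> >     v3 = np.array([-d / 3, np.sqrt(2) / 3 * d, np.sqrt(6) / 3 * d])
> >     # Prints 6.7273 6.3426 5.4928 0.0 0.0 2.2424 0.0 -2.2424 3.1713,
> >     # i.e. the box line above, so the box geometry itself looks fine.
> >     print(v1[0], v2[1], v3[2], v1[1], v1[2],
> >           v2[0], v2[2], v3[0], v3[1])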
> >
> > Thank you for your help.
> Start with EM, if necessary with a slightly larger box.
> >
> > Sincerely,
> >
> > --
> > Jordi Camps Puchades
> > Instituto Nacional de Bioinformatica (INB) Nodo Computacional GNHC-2
> > UPC-CIRI
> > c/. Jordi Girona 1-3
> > Modul C6-E201 Tel. : 934 011 650
> > E-08034 Barcelona Fax : 934 017 014
> > Catalunya (Spain) e-mail: jcamps at lsi.upc.edu
> >
> > -----Original Message-----
> > From: gmx-users-bounces at gromacs.org
> > [mailto:gmx-users-bounces at gromacs.org] On behalf of Anton Feenstra
> > Sent: Thursday, 19 May 2005 7:45
> > To: Discussion list for GROMACS users
> > Subject: Re: [gmx-users] Excessive compute time using PME
> >
> >
> > Jordi Camps wrote:
> >
> > > Hello list!
> > >
> > > Yes, you were right about the problem. I had computed bad box
> > > vectors, which made the box extra-large. I have now reduced the
> > > box to its real size, but I get some kind of error. GROMACS
> > > writes step-1.pdb and step0.pdb files (the second one shows that
> > > all the molecules exploded) and then core dumps. I'm working to
> > > find the cause.
> >
> > Look at your periodic boundaries. If you changed the box size, you
> > may well have (bad) overlap at the 'edges'.
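> >
> > A brute-force way to check for such overlaps (a sketch; fine for a
> > one-off sanity check, far too slow for anything big) is to scan for
> > atom pairs that come closer than some threshold across the periodic
> > images of the triclinic box:
> >
> >     import itertools
> >     import numpy as np
> >
> >     def close_contacts(coords, box, cutoff=0.08):
> >         # coords: (N, 3) array in nm; box: (3, 3) array of box
> >         # vectors. Checks every pair against the 27 neighbouring
> >         # periodic images; O(27 N^2), so small systems only.
> >         shifts = [i * box[0] + j * box[1] + k * box[2]
> >                   for i, j, k in itertools.product((-1, 0, 1),
> >                                                    repeat=3)]
> >         bad = []
> >         for a in range(len(coords)):
> >             for b in range(a + 1, len(coords)):
> >                 d = min(np.linalg.norm(coords[a] - coords[b] + s)
> >                         for s in shifts)
> >                 if d < cutoff:
> >                     bad.append((a, b, d))
> >         return bad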
> >
> >
> --
> David.
> ________________________________________________________________________
> David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics
> group, Dept. of Cell and Molecular Biology, Uppsala University.
> Husargatan 3, Box 596, 75124 Uppsala, Sweden
> phone: 46 18 471 4205 fax: 46 18 511 755
> spoel at xray.bmc.uu.se spoel at gromacs.org
> http://xray.bmc.uu.se/~spoel
> ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
>
>
> _______________________________________________
> gmx-users mailing list
> gmx-users at gromacs.org
> http://www.gromacs.org/mailman/listinfo/gmx-users
> Please don't post (un)subscribe requests to the list. Use the
> www interface or send it to gmx-users-request at gromacs.org.
>