[gmx-users] Simplified picture: comments invited
Mark Abraham
mark.j.abraham at gmail.com
Mon Jun 2 08:59:52 CEST 2014
On Mon, Jun 2, 2014 at 4:33 AM, Chetan Mahajan <chetanvm10 at gmail.com> wrote:
> Hi,
>
> Unfortunately, there is no scope I can try smaller version of the system.
>
That seems highly unlikely. You could eliminate water. You could cut out a
small chunk of TiO2 and put whatever you're using to terminate the lattice
on the outside. You need to validate both your implementation of mixed VDW
interactions, and GROMACS implementation of it.
> However, let me explain one lead that I indicated earlier. The Coulomb energy
> at the end of minimization is -5E-7, which becomes "-nan" at the zeroth
> step of the NPT molecular dynamics run after minimization, and so the code
> crashes. The question is: what gives two different energy values
> (-5E-7 and -nan) for the same atomic configuration (end of minimization,
> beginning of molecular dynamics)? I removed the barostat, thermostat, and
> h-bond constraints from the MD run, so as to come as close to the
> minimization run as possible. However, the Coulomb energy in the MD run is
> still blowing up. This is very interesting, since, as I said, we have two
> points with different energies but the same atomic configuration. Any
> further leads?
>
Can't tell without an .mdp file, but it's not worth talking about while the
system is not neutral. Neutralize it first. Then run it with all LJ, without
tables. Then all LJ with tables. Then add your Buckingham tables.
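Roughly, the only .mdp lines that need to change between those steps are the
non-bonded selectors. The fragment below is a sketch, assuming PME is currently
used for electrostatics; the group names TiO2 and SOL are placeholders:

; step 1: analytic LJ, no tables (baseline)
vdwtype          = cut-off
coulombtype      = PME

; step 2: the same LJ, but read from table.xvg (columns g and h carry the
; dispersion and repulsion parts); switch coulombtype to PME-User only if
; the Coulomb part is meant to be tabulated too
vdwtype          = user

; step 3: put the Buckingham form into the table columns, optionally with a
; separate table per pair of energy groups (table_TiO2_SOL.xvg, etc.)
energygrps       = TiO2 SOL
energygrp_table  = TiO2 SOL

Whichever step first reproduces the -nan Coulomb energy tells you whether the
problem is in the plain model, in the tabulation, or in the Buckingham terms.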
> (The total charge is -0.018 in both the minimization and the MD run, so that's
> not the issue; the water is flexible during minimization and not during MD.
> However, that should not matter, since the code is crashing at the very first
> step of the MD run.)
>
That could mean anything, including that your MPI compilation of GROMACS is
not working. Simplify!
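One cheap way to separate the two, assuming a non-MPI (serial or thread-MPI)
mdrun is available locally and with npt.tpr standing in for whatever run input
is crashing:

# Run the identical .tpr on a single core, no MPI.
mdrun -nt 1 -s npt.tpr -deffnm npt_serial

If the Coulomb energy still goes to -nan on one core, MPI is off the hook and
the problem is in the model, the tables, or the starting configuration.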
Mark
> Thanks
> Chetan
>
>
> On Sun, Jun 1, 2014 at 5:11 PM, Mark Abraham <mark.j.abraham at gmail.com>
> wrote:
>
> > Hi,
> >
> > What's the smallest relevant neutral system you can think of? Test your
> > implementation of your model on that, on a local machine, not a
> > supercomputer ;-) If there's any diagnosis of why the crash happens, it
> > will be in the log file. But you seem to have far too many variables to
> > know where the major problem lies.
> >
> > Mark
> >
> >
> > On Sun, Jun 1, 2014 at 11:55 PM, Chetan Mahajan <chetanvm10 at gmail.com>
> > wrote:
> >
> > > More information:
> > >
> > > As the log file shows, the Coulomb interaction energy is shooting to nan
> > > during equilibration, where the error occurs. The Coulomb interaction
> > > energy during minimization is finite.
> > >
> > > Thanks
> > > Chetan
> > >
> > >
> > > On Sun, Jun 1, 2014 at 4:44 PM, Chetan Mahajan <chetanvm10 at gmail.com>
> > > wrote:
> > >
> > > > Dear All,
> > > >
> > > > I am testing my gromacs code using user-specified potential functions
> > > > (LJ as well as Buckingham as non-bonded potentials). *Minimization run
> > > > with these potentials runs fine*. However, the equilibration run gives
> > > > the following error, about which I am clueless. Could anyone help? Is it
> > > > because the total charge is -0.018 and not zero?
> > > >
> > > > ...........
> > > >
> > > > Non-default thread affinity set probably by the OpenMP library,
> > > > disabling internal thread affinity
> > > > starting mdrun 'tio2_formate'
> > > > 1000000 steps, 1000.0 ps.
> > > > [c418-701.stampede.tacc.utexas.edu:mpi_rank_5][error_sighandler] Caught
> > > > error: Segmentation fault (signal 11)
> > > > [c418-701.stampede.tacc.utexas.edu:mpispawn_0][readline] Unexpected
> > > > End-Of-File on file descriptor 7. MPI process died?
> > > > [c418-701.stampede.tacc.utexas.edu:mpispawn_0][mtpmi_processops] Error
> > > > while reading PMI socket. MPI process died?
> > > > [c418-701.stampede.tacc.utexas.edu:mpispawn_0][child_handler] MPI process
> > > > (rank: 5, pid: 122921) terminated with signal 11 -> abort job
> > > > [c418-701.stampede.tacc.utexas.edu:mpirun_rsh][process_mpispawn_connection]
> > > > mpispawn_0 from node c418-701 aborted: Error while reading a PMI socket (4)
> > > > [c428-301.stampede.tacc.utexas.edu:mpispawn_1][read_size] Unexpected
> > > > End-Of-File on file descriptor 21. MPI process died?
> > > > [c428-301.stampede.tacc.utexas.edu:mpispawn_1][read_size] Unexpected
> > > > End-Of-File on file descriptor 21. MPI process died?
> > > > [c428-301.stampede.tacc.utexas.edu:mpispawn_1][handle_mt_peer] Error
> > > > while reading PMI socket. MPI process died?
> > > > [c428-902.stampede.tacc.utexas.edu:mpispawn_2][read_size] Unexpected
> > > > End-Of-File on file descriptor 21. MPI process died?
> > > > [c428-902.stampede.tacc.utexas.edu:mpispawn_2][read_size] Unexpected
> > > > End-Of-File on file descriptor 21. MPI process died?
> > > > [c428-902.stampede.tacc.utexas.edu:mpispawn_2][handle_mt_peer] Error
> > > > while reading PMI socket. MPI process died?
> > > > TACC: MPI job exited with code: 1
> > > >
> > > > TACC: Shutdown complete. Exiting.
> > > >
> > > >
> > > > NOTE 1 [file lnanopf3.top, line 4422]:
> > > > System has non-zero total charge: -0.018000
> > > > Total charge should normally be an integer. See
> > > > http://www.gromacs.org/Documentation/Floating_Point_Arithmetic
> > > > for discussion on how close it should be to an integer.
> > > >
> > > >
> > > >
> > > >
> > > > -------------------------------------------------------
> > > > Program grompp, VERSION 4.6.5
> > > > Source code file:
> > > > /work/01906/xzhu216/gromacs/source/gromacs-4.6.5/src/gmxlib/futil.c,
> > > > line: 593
> > > >
> > > > File input/output error:
> > > > /work/01019/cmahajan/postdoc/mapotruns/r1/eq/lnanop.gro
> > > >
> > > > ...........
> > > >
> > > > Thanks a lot!
> > > >
> > > > regards
> > > > Chetan
> > > >