[gmx-users] Re: Failed to lock: pre.log (Gromacs 4.5.3)

Roland Schulz roland at utk.edu
Fri Nov 26 17:41:02 CET 2010


Hi Baofu,

could you provide more information about the file system?
The command "mount" shows which file system is in use. If it is a
network file system, then the operating system and the file system used on
the file server are also of interest.
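
If you want to test that file system directly in the meantime, below is a
small stand-alone sketch (plain C, written just for this mail; it is not the
GROMACS code, it only mimics the kind of fcntl() write lock mdrun takes on
the log file before appending). It prints the file system magic number of the
file you pass on the command line (compare it to the table in statfs(2)) and
then tries to lock that file. If the lock attempt fails with "Function not
implemented" (ENOSYS), the file system simply does not support this kind of
locking, and keeping pre.log on another file system, or running with
-noappend, is the way around it. The path in the example is just a
placeholder.

/* locktest.c -- minimal sketch: does the file system holding a given
 * file support fcntl() locking?  This only mimics what mdrun does when
 * appending to the log file; it is not the actual GROMACS code.
 *
 * Build:  gcc -o locktest locktest.c
 * Run:    ./locktest /path/on/WORKSPACE/testfile   (path is an example)
 */
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/vfs.h>            /* statfs(), Linux-specific */

int main(int argc, char *argv[])
{
    struct statfs fs;
    struct flock  fl;
    int           fd;

    if (argc != 2)
    {
        fprintf(stderr, "Usage: %s <file on the file system to test>\n", argv[0]);
        return 1;
    }

    /* Report the file system magic number; look it up in statfs(2)
     * (or linux/magic.h) to see whether this is NFS, Lustre, GPFS, ... */
    if (statfs(argv[1], &fs) == 0)
    {
        printf("file system magic number: 0x%lx\n", (unsigned long) fs.f_type);
    }

    fd = open(argv[1], O_WRONLY | O_CREAT, 0644);
    if (fd < 0)
    {
        perror("open");
        return 1;
    }

    /* Try to take a write lock on the whole file, the same kind of lock
     * mdrun wants on the log file before it appends to it. */
    memset(&fl, 0, sizeof(fl));
    fl.l_type   = F_WRLCK;
    fl.l_whence = SEEK_SET;
    fl.l_start  = 0;
    fl.l_len    = 0;            /* 0 means: lock the whole file */

    if (fcntl(fd, F_SETLK, &fl) == -1)
    {
        /* ENOSYS here is printed as "Function not implemented". */
        fprintf(stderr, "fcntl(F_SETLK) failed: %s\n", strerror(errno));
        close(fd);
        return 1;
    }

    printf("locking works on this file system\n");
    close(fd);
    return 0;
}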

Roland

On Fri, Nov 26, 2010 at 11:00 AM, Baofu Qiao <qiaobf at gmail.com> wrote:

> Hi Roland,
>
> Thanks a lot!
>
> OS: Scientific Linux 5.5. But the data are stored on a system called
> WORKSPACE, which is different from the regular file system. Maybe this is
> the reason.
>
> I'll try what you suggest!
>
> regards,
> Baofu Qiao
>
>
> On 11/26/2010 04:07 PM, Roland Schulz wrote:
> > Baofu,
> >
> > what operating system are you using? On what file system do you try to
> > store the log file? The error (should) mean that the file system you use
> > doesn't support locking of files.
> > Try to store the log file on some other file system. If you want, you can
> > still store the (large) trajectory files on the same file system.
> >
> > Roland
> >
> > On Fri, Nov 26, 2010 at 4:55 AM, Baofu Qiao <qiaobf at gmail.com> wrote:
> >
> >
> >> Hi Carsten,
> >>
> >> Thanks for your suggestion! But my simulation will run for about 200 ns,
> >> at 10 ns per day (24 hours is the maximum duration of a single job on the
> >> cluster I am using), which would generate about 20 trajectory files!
> >>
> >> Can anyone find the reason for this error?
> >>
> >> regards,
> >> Baofu Qiao
> >>
> >>
> >> On 11/26/2010 09:07 AM, Carsten Kutzner wrote:
> >>
> >>> Hi,
> >>>
> >>> as a workaround you could run with -noappend and later
> >>> concatenate the output files. Then you should have no
> >>> problems with locking.
> >>>
> >>> Carsten
> >>>
> >>>
> >>> On Nov 25, 2010, at 9:43 PM, Baofu Qiao wrote:
> >>>
> >>>
> >>>
> >>>> Hi all,
> >>>>
> >>>> I just recompiled GMX 4.0.7, and this error does not occur there. But
> >>>> 4.0.7 is about 30% slower than 4.5.3, so I would really appreciate it
> >>>> if anyone could help me with this!
> >>>>
> >>>> best regards,
> >>>> Baofu Qiao
> >>>>
> >>>>
> >>>> On 2010-11-25 20:17, Baofu Qiao wrote:
> >>>>
> >>>>
> >>>>> Hi all,
> >>>>>
> >>>>> I got the error message below when extending the simulation using the
> >>>>> following command:
> >>>>>
> >>>>> mpiexec -np 64 mdrun -deffnm pre -npme 32 -maxh 2 -table table -cpi pre.cpt -append
> >>>>>
> >>>>> The previous simulation succeeded. I wonder why pre.log is locked,
> >>>>> and what the strange warning "Function not implemented" means.
> >>>>>
> >>>>> Any suggestion is appreciated!
> >>>>>
> >>>>> *********************************************************************
> >>>>> Getting Loaded...
> >>>>> Reading file pre.tpr, VERSION 4.5.3 (single precision)
> >>>>>
> >>>>> Reading checkpoint file pre.cpt generated: Thu Nov 25 19:43:25 2010
> >>>>>
> >>>>> -------------------------------------------------------
> >>>>> Program mdrun, VERSION 4.5.3
> >>>>> Source code file: checkpoint.c, line: 1750
> >>>>>
> >>>>> Fatal error:
> >>>>> Failed to lock: pre.log. Function not implemented.
> >>>>> For more information and tips for troubleshooting, please check the
> >>>>> GROMACS website at http://www.gromacs.org/Documentation/Errors
> >>>>> -------------------------------------------------------
> >>>>>
> >>>>> "It Doesn't Have to Be Tip Top" (Pulp Fiction)
> >>>>>
> >>>>> Error on node 0, will try to stop all the nodes
> >>>>> Halting parallel program mdrun on CPU 0 out of 64
> >>>>>
> >>>>> gcq#147: "It Doesn't Have to Be Tip Top" (Pulp Fiction)
> >>>>>
> >>>>>
> >>>>>
> >>>>> --------------------------------------------------------------------------
> >>>>> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> >>>>> with errorcode -1.
> >>>>>
> >>>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> >>>>> You may or may not see output from other processes, depending on
> >>>>> exactly when Open MPI kills them.
> >>>>>
> >>>>>
> >>>>> --------------------------------------------------------------------------
> >>>>>
> >>>>> --------------------------------------------------------------------------
> >>>>> mpiexec has exited due to process rank 0 with PID 32758 on
> >>>>>
> >>>>>
> >>>>>


-- 
ORNL/UT Center for Molecular Biophysics cmb.ornl.gov
865-241-1537, ORNL PO BOX 2008 MS6309

