[gmx-users] gromacs on glacier

Justin A. Lemkul jalemkul at vt.edu
Wed Jul 15 02:45:31 CEST 2009



Payman Pirzadeh wrote:
> Hi Justin,
> Regarding your suggestion to use spc216: when I tried to energy-minimize
> spc216 with grompp, I got the following error:
> 
>  Program grompp, VERSION 4.0.4
> Source code file: topio.c, line: 415
> 
> Fatal error:
> Syntax error - File spce.itp, line 1
> Last line read:
> '[ moleculetype ]'
> Invalid order for directive moleculetype
> 
> Same thing happened when I tried TIP4P. Where is the problem? I did not have
> this issue with my own model.

Something is malformed in your .top, but it's impossible to comment any further 
unless you post your entire .top file.
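
For what it's worth, that particular error usually means that spce.itp is
#included before the force field itself, or after the [ system ] /
[ molecules ] section.  A minimal .top for a pure SPC/E box typically looks
something like the sketch below - the OPLS-AA include is only an example, so
use whichever force field you actually parameterized against, and 216 is just
the molecule count that matches spc216.gro:

   ; minimal sketch of a topology for a box of SPC/E water
   #include "ffoplsaa.itp"    ; force field comes first
   #include "spce.itp"        ; water model after the force field

   [ system ]
   SPC/E water

   [ molecules ]
   SOL    216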

-Justin

> Regards,
> 
> Payman
> 
> -----Original Message-----
> From: gmx-users-bounces at gromacs.org [mailto:gmx-users-bounces at gromacs.org]
> On Behalf Of Justin A. Lemkul
> Sent: June 8, 2009 1:53 PM
> To: Gromacs Users' List
> Subject: Re: [gmx-users] gromacs on glacier
> 
> 
> 
> Payman Pirzadeh wrote:
>> Dear Justin,
>> Here is the mpich version:
>> MPICH Version:          1.2.7p1
>> MPICH Release date:     $Date: 2005/11/04 11:54:51$
>> MPICH Patches applied:  none
>> MPICH configure:        -prefix=/share/apps/intel/mpich/
>> MPICH Device:           ch_p4
>>
> 
> Well, I think that MPICH is your problem then.  There are several reports of
> sporadic bugs (the one that you're seeing!) with that version.  Since it's
> several years old, getting a fix is probably a bit unlikely :)
> 
> Perhaps you can sort out with the sysadmins what you can do.  Like I said
> before, probably OpenMPI is a better bet - we've never had a problem with it.
> You can probably install it yourself in your home directory, and point to
> mpicc with environment variables during the Gromacs installation.
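> 
> (A rough sketch of that kind of home-directory install; the version number
> and install paths here are placeholders, and the exact variable that the
> Gromacs configure script expects for the MPI compiler may differ, so check
> ./configure --help on your version:)
> 
>    tar xzf openmpi-<version>.tar.gz
>    cd openmpi-<version>
>    ./configure --prefix=$HOME/openmpi
>    make && make install
> 
>    # then point the Gromacs build at the new wrappers
>    export PATH=$HOME/openmpi/bin:$PATH
>    export CC=$HOME/openmpi/bin/mpicc
>    ./configure --enable-mpi --prefix=$HOME/gromacs-4.0.4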
> 
>> Also the gcc compiler:
>> gcc (GCC) 3.4.6 20060404 (Red Hat 3.4.6-9)
>> Copyright (C) 2006 Free Software Foundation, Inc.
>>
>> About the system, I have a water model with 3 atom sites and 3 virtual
>> sites.
> 
> Probably not a very good test case for diagnosing problems, but I think it's
> unrelated in this case.  When testing, keep it simple - run a simulation with
> spc216.gro from the Gromacs installation; it should pretty much always work
> :)
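> 
> (A rough sketch of such a test; em.mdp and topol.top are files you write
> yourself - a nearly empty .mdp is fine for a quick check - and the location
> of spc216.gro is taken from the GMXLIB variable that GMXRC sets, which may
> differ on your install:)
> 
>    cp $GMXLIB/spc216.gro .
>    grompp -f em.mdp -c spc216.gro -p topol.top -o test.tpr
>    mpirun -np 4 mdrun -deffnm test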
> 
> -Justin
> 
>> Payman
>>
>> -----Original Message-----
>> From: gmx-users-bounces at gromacs.org [mailto:gmx-users-bounces at gromacs.org]
>> On Behalf Of Justin A. Lemkul
>> Sent: June 8, 2009 11:56 AM
>> To: Gromacs Users' List
>> Subject: Re: [gmx-users] gromacs on glacier
>>
>>
>>
>> Payman Pirzadeh wrote:
>>> Hi Justin,
>>> Since the manual itself was not sufficient, I asked some other people who
>>> are running GROMACS in our group (but they run only on 2 CPUs). Here are
>>> the steps I took to compile the parallel version (I have included the
>>> notes they gave me as well):
>>>
>>> Installation
>>> 1.	./configure --prefix=/global/home/pirzadeh/gromacs-4.0.4
>>> This line specifies the installation path
>>>
>>> 2.	make
>>> 3.	make install
>>> These two commands build 'grompp' and the analysis tools of GROMACS.
>>>
>>> 4.	make clean
>>> This command cleans up files generated during the build which are not
>>> needed anymore.
>>>
>>> 5.	./configure --enable-mpi --disable-nice
>>> --prefix=/global/home/pirzadeh/gromacs-4.0.4
>>> Here we reconfigure for the parallel (MPI) version of GROMACS.
>>>
>>> 6.	make mdrun
>>> 7.	make install-mdrun
>>> Now the parallel version of mdrun is built. The analysis tools are found
>>> in the 'bin' folder, alongside 'GMXRC'.
>>>
>>> 8.	Before running 'grompp' to produce the run input (.tpr) file for the
>>> simulation, we should use the command
>>> source /global/home/pirzadeh/gromacs-4.0.4/bin/GMXRC to put the freshly
>>> installed code on the path (the whole sequence is collected into one
>>> sketch after this list).
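>>>
>>> (For reference, a rough sketch of the same sequence collected into one
>>> block; the commands are only a summary of the steps above, run from the
>>> unpacked Gromacs source directory, nothing beyond them:)
>>>
>>>    ./configure --prefix=/global/home/pirzadeh/gromacs-4.0.4
>>>    make && make install              # serial grompp + analysis tools
>>>    make clean
>>>    ./configure --enable-mpi --disable-nice \
>>>                --prefix=/global/home/pirzadeh/gromacs-4.0.4
>>>    make mdrun && make install-mdrun  # MPI-enabled mdrun
>>>    source /global/home/pirzadeh/gromacs-4.0.4/bin/GMXRC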
>>>
>> This is pretty much the standard installation procedure.  What we also need
>> to know are the compilers used, etc. for the installation.  The error you're
>> getting is from MPICH.  Which version is installed on the cluster?  It may
>> be old and buggy.  In any case, you can try to install something newer, like
>> a recent version of OpenMPI (which we have on our cluster); it may be more
>> reliable.  Only a random thought, means nothing unless we know what you have
>> installed :)
>>
>> What about the contents of your system, as I asked before?  Do you really
>> have a 100% virtual site system?
>>
>> -Justin
>>
>>> Sorry for tons of e-mails.
>>>
>>> Payman
>>>
>>> -----Original Message-----
>>> From: gmx-users-bounces at gromacs.org [mailto:gmx-users-bounces at gromacs.org]
>>> On Behalf Of Justin A. Lemkul
>>> Sent: June 8, 2009 11:12 AM
>>> To: Discussion list for GROMACS users
>>> Subject: Re: [gmx-users] gromacs on glacier
>>>
>>>
>>>
>>> Payman Pirzadeh wrote:
>>>> Hi,
>>>>
>>>> I had the chance to run GROMACS 4.0.4 on another cluster. Same problem
>>>> still persists. But what I found is that it can be run on a node with
>>>> 2 CPUs, but as soon as the number of nodes is increased to 2, 3, ... it
>>>> will crash. Following are the last lines reported in different files:
>>>>
>>>> "In the log file of the code":
>>>>
>>>>  
>>>>
>>>> There are: 1611 Atoms
>>>>
>>>> There are: 1611 VSites
>>> All of your atoms are virtual sites?  If so, I would try a simpler test
>>> case, to rule out stumbling across some obscure bug.
>>>
>>> Also:
>>>
>>>> p2_22627:  p4_error: Timeout in establishing connection to remote process: 0
>>>
>>> This is an error message from MPICH, not Gromacs.  See, for example:
>>>
>>> http://www.mail-archive.com/gmx-users@gromacs.org/msg10968.html
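>>>
>>> (A quick way to check whether MPICH itself can talk across nodes,
>>> independent of Gromacs, is to compile and run the little "cpi" example
>>> that ships in the MPICH source tree; the machinefile name below is a
>>> placeholder for whatever your cluster uses:)
>>>
>>>    mpicc cpi.c -o cpi
>>>    mpirun -np 4 -machinefile machines ./cpi
>>>
>>> (If this also dies with "p4_error: Timeout in establishing connection",
>>> the problem is the MPI installation, not mdrun.)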
>>>
>>> <snip>
>>>
>>>> To me, it seems that the code cannot communicate across more than one
>>>> node. I suspect I did something wrong during installation! I tried the
>>>> wiki, but I cannot find the documents as before, and I really do not
>>>> know in which step I might have gone wrong.
>>>>
>>> If you suspect you have done something wrong, then post the details of the
>>> system configuration (hardware, compilers, OS, etc.) as well as a
>>> step-by-step record of what you did to compile the software.  If your
>>> procedure is sound, then it helps rule out the possibility that you messed
>>> something up.
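>>>
>>> (A few commands that collect that kind of information; the mpichversion
>>> script and the -show option may or may not exist in your particular MPI
>>> install, so treat them as suggestions:)
>>>
>>>    uname -a          # OS and kernel
>>>    gcc --version     # compiler
>>>    mpicc -show       # what the MPI compiler wrapper actually invokes
>>>    mpichversion      # MPICH version details, if the script is installed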
>>>
>>> -Justin
>>>
>>>>
>>>> Payman
>>>>
>>>>

-- 
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================


