[gmx-users] LINCS and number of nodes
Moeed
lecielll at googlemail.com
Sat Mar 19 02:37:59 CET 2011
Hello Justin,
Thanks for your reply. My late response is because I have been trying to
resolve the problem.
>> Dear experts,
>>
>> I am trying to build up a polymer-in-hexane system by increasing the
>> density. After the PR step, my NVT and NPT trials failed. Initially I was
>> getting LINCS and 1-4 warnings (even for NVT) which were not caused by a
>> flawed topology file; it turned out that the simulations crashed simply
>> because I was using -np > 4. But even with -np 4, NPT did not work, which
>> made me switch from the Parrinello-Rahman to the Berendsen pressure
>> coupling scheme. As I approached the desired density the simulation
>> crashed again, so I used
>>
>> trjconv -s .tpr -f .trr -o frame2300.gro -dump 2300
>>
>> to extract one of the frames before the crash, and then ran another NVT to
>> equilibrate:
>> mpirun -np 4 mdrun_mpi -deffnm PE60-110Hex-NPT3-frame2300_md -s -o -c -g
>> -e -x -v -pd
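>>
>> For completeness, the run input for that job was regenerated from the
>> dumped frame roughly as follows (the .mdp and topology file names here are
>> illustrative):
>>
>> grompp -f nvt.mdp -c frame2300.gro -p topol.top \
>>        -o PE60-110Hex-NPT3-frame2300_md.tpr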
>>
>> After around 1 ns I get the error below (the mdp file is also included). I
>> describe all of this because I have already seen a case where the root
>> cause of the problem was not the topology but purely a computational issue
>> (I mean -np), and I am curious whether the same thing applies here. Please
>> help me with this. Thank you in advance.
>>
>>
> What MPI library (and version) are you using? Do your runs work in serial?
>
I am using OpenMPI 1.4.3. When I try mdrun in serial I get many LINCS and
1-4 warnings, ending with a segmentation fault at the very beginning.
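For reference, the serial test was a plain single-process run of the same
.tpr (this assumes a non-MPI mdrun binary is installed alongside mdrun_mpi):

mdrun -deffnm PE60-110Hex-NPT3-frame2300_md -v

The output begins: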
Wrote pdb files with previous and current coordinates
Wrote pdb files with previous and current coordinates
Warning: 1-4 interaction between 1433 and 1441 at distance 2.472 which is
larger than the 1-4 table size 2.250 nm
These are ignored for the rest of the simulation
This usually means your system is exploding,
if not, you should increase table-extension in your mdp file
or with user tables increase the table size
Step 11, time 0.022 (ps) LINCS WARNING
relative constraint deviation after LINCS:
rms 0.004346, max 0.036836 (between atoms 116 and 119)
The parallel run does not work for -np > 4.
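To be explicit, the behavior looks like this (same command as above, with
only -np changed; 8 is just an example of a count above 4):

mpirun -np 4 mdrun_mpi -deffnm PE60-110Hex-NPT3-frame2300_md -pd   (runs)
mpirun -np 8 mdrun_mpi -deffnm PE60-110Hex-NPT3-frame2300_md -pd   (crashes)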
>> More details: there is only one polyethylene chain (60 units) in 110
>> hexane. The chain is not convoluted and has a somewhat extended shape,
>> which makes it hard to fit in the box.
>>
>>
> What do you mean it doesn't fit in the box? If you've got a system that
> you're trying to force into some shape or size, your PE chain is probably
> just crashing into itself across periodic boundaries. Watch the trajectory
> to see what's going on prior to the crash.
>
I am not forcing it into the box. I am just increasing the density, but
since the chain is too extended it crashes into itself, as you predicted; I
have no control over this. So I took another frame from before the crash and
tried a new conformation, hoping that a long simulation from this starting
structure would not crash.
I also tried a free energy (FE) run, but I get no dhdl output file, even
when I include -dhdl on the mdrun command line. The relevant mdp settings
are:
free_energy              = yes
init_lambda              = 0
delta_lambda             = 0
sc_alpha                 = 0.5
sc-power                 = 1
sc_sigma                 = 0.3
foreign_lambda           = 0.1
dhdl_derivatives         = yes
couple-moltype           = Polymer
couple-lambda0           = vdw-q ;vdw
couple-lambda1           = none ;vdw ;none
couple-intramol          = yes
nstdhdl                  = 10
separate_dhdl_file       = yes
dh_hist_size             = 0
dh_hist_spacing          = 0.1
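For reference, the FE run was launched along these lines (the -deffnm name
is illustrative; -dhdl names the dH/dlambda output file):

mpirun -np 4 mdrun_mpi -deffnm PE60-110Hex-FE -dhdl dhdl.xvg -v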
Please give me some advice.
Thanks
>
> -Justin
>
>
> Moeed
>> ===========================================
>>
>> step 449800, will finish Tue Mar 15 11:23:55 2011
>> step 449900, will finish Tue Mar 15 11:23:55 2011
>> [node5:09563] *** Process received signal ***
>> [node5:09563] Signal: Segmentation fault (11)
>> [node5:09563] Signal code: Address not mapped (1)
>> [node5:09563] Failing at address: 0xffffffff80849dc0
>> [node5:09563] [ 0] /lib64/libpthread.so.0 [0x3a2660eb10]
>> [node5:09563] [ 1] mdrun_mpi [0x4f0155]
>> [node5:09563] [ 2] mdrun_mpi(gmx_pme_do+0x216d) [0x4f9c1d]
>> [node5:09563] [ 3] mdrun_mpi(do_force_lowlevel+0x21c8) [0x49c658]
>> [node5:09563] [ 4] mdrun_mpi(do_force+0xc59) [0x50db19]
>> [node5:09563] [ 5] mdrun_mpi(do_md+0x5623) [0x43e353]
>> [node5:09563] [ 6] mdrun_mpi(mdrunner+0xa07) [0x435e07]
>> [node5:09563] [ 7] mdrun_mpi(main+0x1269) [0x443319]
>> [node5:09563] [ 8] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3a25e1d994]
>> [node5:09563] [ 9] mdrun_mpi [0x420449]
>> [node5:09563] *** End of error message ***
>> --------------------------------------------------------------------------
>> mpirun noticed that process rank 0 with PID 9563 on node node5.reyclus.loc
>> exited on signal 11 (Segmentation fault).
>> -----------------------------------------------------------------------
>>
>>
>>
>> pbc                    = xyz
>> ;energygrps            = PE HEX
>>
>> ; Run control
>> integrator             = md
>> dt                     = 0.002
>> nsteps                 = 1000000 ;5000
>> nstcomm                = 100
>>
>> ; Output control
>> nstenergy              = 100
>> nstxout                = 100
>> nstvout                = 0
>> nstfout                = 0
>> nstlog                 = 1000
>> nstxtcout              = 1000
>>
>> ; Neighbor searching
>> nstlist                = 10
>> ns_type                = grid
>>
>> ; Electrostatics/VdW
>> coulombtype            = PME
>> vdw-type               = Shift
>> rcoulomb-switch        = 0
>> rvdw-switch            = 0.9 ;0
>>
>> ; Cut-offs
>> rlist                  = 1.25
>> rcoulomb               = 1.25 ;1.1
>> rvdw                   = 1.0
>>
>> ; PME parameters
>> fourierspacing         = 0.12
>> fourier_nx             = 0
>> fourier_ny             = 0
>> fourier_nz             = 0
>> pme_order              = 4
>> ewald_rtol             = 1e-5
>> optimize_fft           = yes
>>
>> ; Temperature coupling
>> Tcoupl                 = v-rescale
>> tc-grps                = System ;HEX
>> tau_t                  = 0.1 ;0.1
>> ref_t                  = 300 ;300
>>
>> ; Pressure coupling
>> Pcoupl                 = no ;berendsen
>> Pcoupltype             = isotropic
>> tau_p                  = 0.5 ;0.5
>> compressibility        = 4.5e-5 4.5e-5
>> ref_p                  = 30 30
>>
>> ; Velocity generation
>> gen_vel                = yes
>> gen_temp               = 300.0
>> gen_seed               = 173529
>>
>> ; Bonds
>> constraints            = all-bonds
>> constraint-algorithm   = lincs
>>
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================