[gmx-users] LINCS and number of nodes
Mark Abraham
mark.abraham at anu.edu.au
Wed Mar 16 02:54:09 CET 2011
On 16/03/11, Moeed <lecielll at googlemail.com> wrote:
> Dear experts,
>
> I am trying to build up a polymer in hexane system by increasing the density.
>
This seems to have been taking months. Why aren't you using genbox on your polymer starting configuration and an equilibrated box of hexane of the right density?
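Something along the lines of the following would do it (the file names here are only placeholders for your own polymer configuration, pre-equilibrated hexane box, and topology):

genbox -cp polymer.gro -cs hexane_equil.gro -p topol.top -o mixed.gro

genbox will then update the hexane count in the topology for you.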
Mark
> After the position-restraint (PR) step, my NVT and NPT trials failed. Initially I got LINCS and 1-4 warnings (even for NVT) which were not caused by a flawed topology file; it turned out the simulations crashed simply because I was using -np > 4. But even with a smaller -np, NPT did not work, which made me switch from the Parrinello-Rahman to the Berendsen scheme. As I approached the desired density the simulation crashed again, so I used
>
>
> trjconv -s .tpr -f .trr -o frame2300.gro -dump 2300
>
> to extract one of the frames before the crash, and then ran another NVT simulation to equilibrate:
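> To regenerate the .tpr for that frame I used a grompp command along these lines (the .mdp and topology names here are placeholders for my actual files):
> grompp -f nvt.mdp -c frame2300.gro -p topol.top -o PE60-110Hex-NPT3-frame2300_md.tpr
> followed by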
> mpirun -np 4 mdrun_mpi -deffnm PE60-110Hex-NPT3-frame2300_md -s -o -c -g -e -x -v -pd
>
>
> After around 1 ns I get the error below (the mdp file is also included). I described the history above because I have already encountered a situation where the root cause was not the topology but a purely computational issue (I mean -np), and I am curious whether the same thing applies here. Please help me with this. Thank you in advance.
>
>
> More details: there is only one polyethylene chain (60 units) in 110 hexane molecules. The chain is not coiled but somewhat extended, which makes it hard to fit in the box.
>
> Moeed
> ===========================================
>
>
> step 449800, will finish Tue Mar 15 11:23:55 2011
> step 449900, will finish Tue Mar 15 11:23:55 2011
> [node5:09563] *** Process received signal ***
> [node5:09563] Signal: Segmentation fault (11)
> [node5:09563] Signal code: Address not mapped (1)
>
> [node5:09563] Failing at address: 0xffffffff80849dc0
> [node5:09563] [ 0] /lib64/libpthread.so.0 [0x3a2660eb10]
> [node5:09563] [ 1] mdrun_mpi [0x4f0155]
> [node5:09563] [ 2] mdrun_mpi(gmx_pme_do+0x216d) [0x4f9c1d]
>
> [node5:09563] [ 3] mdrun_mpi(do_force_lowlevel+0x21c8) [0x49c658]
> [node5:09563] [ 4] mdrun_mpi(do_force+0xc59) [0x50db19]
> [node5:09563] [ 5] mdrun_mpi(do_md+0x5623) [0x43e353]
> [node5:09563] [ 6] mdrun_mpi(mdrunner+0xa07) [0x435e07]
>
> [node5:09563] [ 7] mdrun_mpi(main+0x1269) [0x443319]
> [node5:09563] [ 8] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3a25e1d994]
> [node5:09563] [ 9] mdrun_mpi [0x420449]
> [node5:09563] *** End of error message ***
>
> --------------------------------------------------------------------------
> mpirun noticed that process rank 0 with PID 9563 on node node5.reyclus.loc exited on signal 11 (Segmentation fault).
> -----------------------------------------------------------------------
>
>
>
>
> pbc = xyz
> ;energygrps = PE HEX
>
> ; Run control
> integrator = md
> dt = 0.002
>
> nsteps = 1000000 ;5000
> nstcomm = 100
>
> ; Output control
> nstenergy = 100
> nstxout = 100
> nstvout = 0
>
> nstfout = 0
> nstlog = 1000
> nstxtcout = 1000
>
> ; Neighbor searching
> nstlist = 10
> ns_type = grid
>
>
> ; Electrostatics/VdW
> coulombtype = PME
> vdw-type = Shift
> rcoulomb-switch = 0
> rvdw-switch = 0.9 ;0
>
>
> ; Cut-offs
> rlist = 1.25
> rcoulomb = 1.25 ;1.1
> rvdw = 1.0
>
> ; PME parameters
> fourierspacing = 0.12
>
> fourier_nx = 0
> fourier_ny = 0
> fourier_nz = 0
> pme_order = 4
> ewald_rtol = 1e-5
> optimize_fft = yes
>
> ; Temperature coupling
>
> Tcoupl = v-rescale
> tc-grps = System ;HEX
> tau_t = 0.1 ;0.1
> ref_t = 300 ;300
>
> ; Pressure coupling
>
> Pcoupl = no;berendsen
> Pcoupltype = isotropic
> tau_p = 0.5 ;0.5
> compressibility = 4.5e-5 4.5e-5
> ref_p = 30 30
>
>
> ; Velocity generation
> gen_vel = yes
> gen_temp = 300.0
> gen_seed = 173529
>
> ; Bonds
> constraints = all-bonds
>
> constraint-algorithm = lincs
>
>
>
>