[gmx-users] " X particles communicated to PME node Y are more ... " error in lysozyme tutorial
Mark Abraham
mark.j.abraham at gmail.com
Tue May 21 14:41:03 CEST 2013
I've opened http://redmine.gromacs.org/issues/1259 to discuss this issue.
All users are warned that such issues are possible, and if you don't run
the regression test suite, you can be at the mercy of a buggy compiler!
Mark
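
For anyone who wants to verify their own binaries, this is roughly how the
regression tests can be run at build time. A minimal sketch, assuming a 4.6.x
source tree; the CMake option names (REGRESSIONTEST_DOWNLOAD, GMX_BUILD_OWN_FFTW)
are quoted from memory, so check them against the install guide for your version:

    tar xzf gromacs-4.6.1.tar.gz
    cd gromacs-4.6.1 && mkdir build && cd build
    # REGRESSIONTEST_DOWNLOAD fetches the matching regression test suite;
    # GMX_BUILD_OWN_FFTW builds a private FFTW (both assumed from the 4.6 docs)
    cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON
    make -j 4
    make check    # runs the regression tests against the freshly built mdrun

A kernel miscompiled by a buggy compiler will normally show up as failures here,
which is much cheaper to find than a -nan in a production run.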
On Tue, May 21, 2013 at 2:35 PM, Mark Abraham <mark.j.abraham at gmail.com> wrote:
> http://gcc.gnu.org/bugzilla/show_bug.cgi?id=49002 looks like it might be
> the issue - fixed between gcc 4.6.1 and 4.6.2. If Vishal would like to
> update his gcc to the latest version (which we always recommend, because
> such issues are distressingly frequent), then he should be able to take
> advantage of the correct and faster AVX_256. Please let us know how it
> goes, Vishal. If that fixes the issue then I will add logic to refuse to
> build AVX_256 with 4.6.1.
>
> Mark
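
Until the compiler is updated, a possible workaround on the affected PC is to
rebuild with the non-AVX kernels, given that the SSE4.1 build on the laptop
works. A rough sketch; GMX_CPU_ACCELERATION is the 4.6-era CMake option name as
I recall it, so verify it against your installation guide:

    gcc --version                 # confirm whether you are on gcc 4.6.1
    cd gromacs-4.6.1/build
    # Force the SSE4.1 kernels instead of AVX_256 to sidestep the suspected
    # gcc 4.6.1 miscompilation (option name assumed from the 4.6 docs)
    cmake .. -DGMX_CPU_ACCELERATION=SSE4.1
    make -j 4 && make check

This costs some performance relative to a correct AVX_256 build, but it should
at least let the tutorial run while the compiler question is sorted out.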
>
>
> On Mon, May 20, 2013 at 3:00 PM, Vishal Kumar Jaiswal <
> vishal3990 at gmail.com> wrote:
>
>> LAPTOP (working) vs. PC (not working):
>>
>> Gromacs version:     both VERSION 4.6.1
>> Precision:           both single
>> Memory model:        laptop 32 bit; PC 64 bit
>> MPI library:         both thread_mpi
>> OpenMP support:      both enabled
>> GPU support:         laptop enabled; PC disabled
>> invsqrt routine:     both gmx_software_invsqrt(x)
>> CPU acceleration:    laptop SSE4.1; PC AVX_256
>> FFT library:         laptop fftw-3.2.2; PC fftw-3.3.3-sse2
>> Large file support:  both enabled
>> RDTSCP usage:        laptop disabled; PC enabled
>> Built on:            laptop Fri Mar 15 14:30:08 IST 2013;
>>                      PC Fri May 10 13:45:48 IST 2013
>> Built by:            laptop vishal at vishal-VPCCW16FG [CMAKE];
>>                      PC vishal at aditya-HCL-Desktop [CMAKE]
>> Build OS/arch:       laptop Linux 3.0.0-12-generic i686;
>>                      PC Linux 3.0.0-19-generic x86_64
>> Build CPU vendor:    both GenuineIntel
>> Build CPU brand:     laptop Intel(R) Core(TM)2 Duo CPU P8700 @ 2.53GHz;
>>                      PC Intel(R) Core(TM) i5-2400 CPU @ 3.10GHz
>> Build CPU family:    laptop 6 Model: 23 Stepping: 10;
>>                      PC 6 Model: 42 Stepping: 7
>>
>> Build CPU features:
>>   laptop: apic clfsh cmov cx8 cx16 lahf_lm mmx msr pdcm pse sse2 sse3
>>           sse4.1 ssse3
>>   PC:     aes apic avx clfsh cmov cx8 cx16 htt lahf_lm mmx msr nonstop_tsc
>>           pcid pclmuldq pdcm popcnt pse rdtscp sse2 sse3 sse4.1 sse4.2
>>           ssse3 tdt x2apic
>>
>> C compiler:
>>   laptop: /usr/bin/gcc GNU gcc (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1
>>   PC:     /usr/bin/gcc GNU gcc (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1
>>
>> C compiler flags:
>>   laptop: -msse4.1 -Wextra -Wno-missing-field-initializers -Wno-sign-compare
>>           -Wall -Wno-unused -Wunused-value -fomit-frame-pointer
>>           -funroll-all-loops -fexcess-precision=fast -O3 -DNDEBUG
>>   PC:     -mavx -Wextra -Wno-missing-field-initializers -Wno-sign-compare
>>           -Wall -Wno-unused -Wunused-value -fomit-frame-pointer
>>           -funroll-all-loops -fexcess-precision=fast -O3 -DNDEBUG
>>
>> C++ compiler:
>>   laptop: /usr/bin/c++ GNU c++ (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1
>>   PC:     none
>>
>> C++ compiler flags:
>>   laptop: -msse4.1 -Wextra -Wno-missing-field-initializers -Wno-sign-compare
>>           -Wall -Wno-unused -Wunused-value -fomit-frame-pointer
>>           -funroll-all-loops -fexcess-precision=fast -O3 -DNDEBUG
>>   PC:     none
>>
>> CUDA compiler:
>>   laptop: nvcc: NVIDIA (R) Cuda compiler driver; Copyright (c) 2005-2012
>>           NVIDIA Corporation; Built on Fri_Sep_21_17:24:42_PDT_2012;
>>           Cuda compilation tools, release 5.0, V0.2.1221
>>   PC:     none
>>
>> CUDA driver:         laptop 5.0; PC none
>> CUDA runtime:        laptop 5.0; PC none
>>
>>
>> These are taken from the GROMACS log files. (Though I have CUDA installed
>> on the laptop, I don't use it for GROMACS, as the NVIDIA compute capability
>> is only 1.2.)
>> I am able to run the tutorial on the PC now using GROMACS 4.5.7.
>>
>> Please tell me if you want more details.
>>
>>
>> On Mon, May 20, 2013 at 5:46 PM, Justin Lemkul <jalemkul at vt.edu> wrote:
>>
>> >
>> >
>> > On 5/20/13 2:29 AM, Vishal Kumar Jaiswal wrote:
>> >
>> >> I am trying to complete the lysozyme-in-water tutorial in GROMACS 4.6.1,
>> >> and I am getting this error in the energy minimization step.
>> >> I was able to complete the entire tutorial on another computer (my
>> >> personal laptop). (I was getting the same error while doing energy
>> >> minimization on another system (polymer in water), which forced me to
>> >> check GROMACS correctness by doing the lysozyme tutorial.)
>> >>
>> >>
>> > How do the configurations of your laptop and the problematic machine
>> > differ?
>> >
>> > -Justin
>> >
>> >
>> >
>> >> The following is the verbose output to my terminal:
>> >>
>> >> Steepest Descents:
>> >> Tolerance (Fmax) = 1.00000e+03
>> >> Number of steps = 50000
>> >> Step= 0, Dmax= 1.0e-02 nm, Epot= -nan Fmax= 1.50489e+13, atom= 14918
>> >> -------------------------------------------------------
>> >> Program mdrun, VERSION 4.6.1
>> >> Source code file: /home/vishal/Downloads/gromacs-4.6.1/src/mdlib/pme.c,
>> >> line: 827
>> >>
>> >> Fatal error:
>> >> 3483 particles communicated to PME node 2 are more than 2/3 times the
>> >> cut-off out of the domain decomposition cell of their charge group in
>> >> dimension x.
>> >> This usually means that your system is not well equilibrated.
>> >>
>> >> Following is the entry from "em.log" file
>> >>
>> >> Linking all bonded interactions to atoms
>> >> There are 8522 inter charge-group exclusions,
>> >> will use an extra communication step for exclusion forces for PME
>> >>
>> >> The initial number of communication pulses is: X 1
>> >> The initial domain decomposition cell size is: X 1.83 nm
>> >>
>> >> The maximum allowed distance for charge groups involved in interactions
>> >> is:
>> >> non-bonded interactions 1.000 nm
>> >> two-body bonded interactions (-rdd) 1.000 nm
>> >> multi-body bonded interactions (-rdd) 1.000 nm
>> >>
>> >>
>> >> Making 1D domain decomposition grid 4 x 1 x 1, home cell index 0 0 0
>> >>
>> >> Initiating Steepest Descents
>> >> Charge group distribution at step 0: 3184 3300 3308 3255
>> >> Grid: 5 x 11 x 11 cells
>> >> Started Steepest Descents on node 0 Mon May 20 11:36:21 2013
>> >>
>> >> Steepest Descents:
>> >> Tolerance (Fmax) = 1.00000e+03
>> >> Number of steps = 50000
>> >>
>> >> Step Time Lambda
>> >> 0 0.00000 0.00000
>> >>
>> >> DD step 0 load imb.: force 8.8%
>> >>
>> >>
>> >> -------------------------------------------------------
>> >> Program mdrun, VERSION 4.6.1
>> >> Source code file: /home/vishal/Downloads/gromacs-4.6.1/src/mdlib/pme.c,
>> >> line: 827
>> >>
>> >> Fatal error:
>> >> 3483 particles communicated to PME node 2 are more than 2/3 times the
>> >> cut-off out of the domain decomposition cell of their charge group in
>> >> dimension x.
>> >> This usually means that your system is not well equilibrated.
>> >>
>> >>
>> >> The following is the hardware information of the PC on which the error
>> >> appears, taken from the "em.log" file:
>> >>
>> >>
>> >> Gromacs version: VERSION 4.6.1
>> >> Precision: single
>> >> Memory model: 64 bit
>> >> MPI library: thread_mpi
>> >> OpenMP support: enabled
>> >> GPU support: disabled
>> >> invsqrt routine: gmx_software_invsqrt(x)
>> >> CPU acceleration: AVX_256
>> >> FFT library: fftw-3.3.3-sse2
>> >> Large file support: enabled
>> >> RDTSCP usage: enabled
>> >> Built on: Fri May 10 13:45:48 IST 2013
>> >> Built by: vishal at aditya-HCL-Desktop [CMAKE]
>> >> Build OS/arch: Linux 3.0.0-19-generic x86_64
>> >> Build CPU vendor: GenuineIntel
>> >> Build CPU brand: Intel(R) Core(TM) i5-2400 CPU @ 3.10GHz
>> >> Build CPU family: 6 Model: 42 Stepping: 7
>> >> Build CPU features: aes apic avx clfsh cmov cx8 cx16 htt lahf_lm mmx msr
>> >> nonstop_tsc pcid pclmuldq pdcm popcnt pse rdtscp sse2 sse3 sse4.1 sse4.2
>> >> ssse3 tdt x2apic
>> >> C compiler: /usr/bin/gcc GNU gcc (Ubuntu/Linaro 4.6.1-9ubuntu3)
>> >> 4.6.1
>> >> C compiler flags: -mavx -Wextra -Wno-missing-field-initializers
>> >> -Wno-sign-compare -Wall -Wno-unused -Wunused-value -fomit-frame-pointer
>> >> -funroll-all-loops -fexcess-precision=fast -O3 -DNDEBUG
>> >>
>> >>
>> > --
>> > ========================================
>> >
>> > Justin A. Lemkul, Ph.D.
>> > Research Scientist
>> > Department of Biochemistry
>> > Virginia Tech
>> > Blacksburg, VA
>> > jalemkul[at]vt.edu | (540) 231-9080
>> > http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>> >
>> > ========================================
>
>