[gmx-users] g_sans_mpi problem: bus error (core dumped)

Mark Abraham mark.j.abraham at gmail.com
Tue Mar 24 12:07:07 CET 2015


Hi,

Does the erroneous behaviour depend on the number of frames read, or the
number of frames analyzed?
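
For instance, one way to separate the two would be to compare a run that skips
frames during analysis against a run on a pre-thinned trajectory. This is only a
sketch: it reuses the file names from your command and assumes the standard
-b/-e/-dt trajectory options and the trjconv_mpi tool from your 4.6.x build.

# Same 50 ns window, but analyze only one frame per 100 ps
# (all frames are still read, fewer are analyzed):
echo 0 | g_sans_mpi -f ../md.xtc -s ../md.tpr -pr -sq -pbc -mode direct \
    -xvg none -b 50000 -e 100000 -dt 100

# Pre-thin the trajectory so fewer frames are read at all, then analyze it:
echo 0 | trjconv_mpi -f ../md.xtc -s ../md.tpr -b 50000 -e 100000 -dt 100 -o md_thin.xtc
echo 0 | g_sans_mpi -f md_thin.xtc -s ../md.tpr -pr -sq -pbc -mode direct -xvg none

If the first run still crashes while the second completes, the problem scales
with the number of frames read; if both complete, it scales with the number of
frames analyzed.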

Mark

On Tue, Mar 24, 2015 at 11:21 AM, CHEN Pan <evan.pan.chen at gmail.com> wrote:

> Dear all,
>
> I am using g_sans_mpi (4.6.4 version) to calculate a SANS curve of a system
> containing about 15000 atoms in total (water, urea, NaOH and oligomers).
> The command used is listed below.
>
> *echo 0 | g_sans_mpi -f ../md.xtc -s ../md.tpr -pr -sq -pbc -mode direct
> -xvg none -b 50000 -e 100000*
>
> The thing is that for a 5 ns portion of the MD trajectory the calculation
> finishes without error, whereas for a 50 ns trajectory it crashes with
> "zsh: bus error (core dumped)", or sometimes with "segmentation fault 11".
> Increasing the available memory or the number of threads (-nt) did not help.
>
> I have googled both "g_sans" and "bus error"; it seems nobody has reported
> such a problem with g_sans before. I understand from the manual that the
> computational cost increases when using -pr and -sq, and also with the
> number of particles in the system. I therefore tried to calculate a SANS
> curve for each frame separately with the following command. Still, the same
> error occurred after reading more than 10 ns of frames.
>
> *echo 0 | g_sans_mpi -f ../md.xtc -s ../md.tpr -prframe -sqframe -pbc -mode
> direct -xvg none -b 50000 -e 100000*
>
> Any comments and potential explanations are welcome.
>
> Best,
> Pan

