[gmx-users] help about "Inconsistent DD boundary staggering limits"

Desheng Zheng dzheng at bgsu.edu
Thu Apr 26 21:30:27 CEST 2012


Thanks Justin!

About the "Software inconsistency error: Inconsistent DD boundary staggering limits!" message:

I still have three concerns.

1. Is it OK if I use grompp in the GROMACS 4.5.5 environment to generate the .edr file, using the .gro file and .top file that were built under GROMACS 4.0.7? (A rough sketch of the kind of commands I mean is given below the questions.)

2. In my protein-DPPC lipid membrane system, I apply an electric field Ez of 0.3 V/nm. Is that value too high, so that it could be causing the system to blow up? (The mdp setting I mean is also shown in the sketch below.)

3. Why is the .edr file generated in the GROMACS 4.5.5 environment larger than the .edr file generated in the GROMACS 4.0.7 environment, even with the same .gro file, the same .top file, and the same commands?
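
For concreteness, here is a minimal sketch of the kind of setup I am asking about in questions 1 and 2. The file names (md.mdp, conf.gro, topol.top, md.*) are only placeholders, not my actual files, and the E_z line just reflects my understanding of how a static field is written in the 4.x mdp format:

    # regenerate the run input under GROMACS 4.5.5 from the old 4.0.7 files
    grompp -f md.mdp -c conf.gro -p topol.top -o md.tpr
    # run; this is the step that writes the md.edr energy file
    mdrun -deffnm md

    ; in md.mdp: static electric field along z
    ; (1 cosine term, amplitude 0.3 V/nm, phase 0)
    E_z = 1 0.3 0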

Best wishes,

Desheng




________________________________________
From: gmx-users-bounces at gromacs.org [gmx-users-bounces at gromacs.org] On Behalf Of Justin A. Lemkul [jalemkul at vt.edu]
Sent: Thursday, April 26, 2012 3:05 PM
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] help about "Inconsistent DD boundary staggering limits"

On 4/26/12 2:52 PM, Desheng Zheng wrote:
> Hi Guys,
>
> I have been running a simulation with a total of 5,000,000 steps. At around step 900,000, the errors below appeared.
>
> Please give me some suggestions on how to fix it.
>

Based on the comment that precedes the error call in the code:

/* Make sure that the grid is not shifted too much */

I would assume that this means your system has simply become unstable and is
blowing up.

http://www.gromacs.org/Documentation/Terminology/Blowing_Up
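
If you want to confirm that, one quick check (just a sketch; it assumes the energy file from your run is called md.edr, so substitute your real file name) is to plot a few energy terms up to the crash and look for a sudden spike shortly before step 900,000:

    echo "Potential Temperature" | g_energy -f md.edr -o before_crash.xvg

Looking at the trajectory frames just before the crash can also show where the instability starts.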

-Justin

> Best wishes,
>
> Desheng
>
> ------------------------------------------------------
> Program mdrun, VERSION 4.5.5
> Source code file: domdec.c, line: 3266
>
> Software inconsistency error:
> Inconsistent DD boundary staggering limits!
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
> "If it weren't for bad luck, we'd have no luck at all" (The Unthanks)
>
> Error on node 102, will try to stop all the nodes
> Halting parallel program mdrun on CPU 102 out of 120
> Error on node 105, will try to stop all the nodes
> Halting parallel program mdrun on CPU 105 out of 120
> Error on node 51, will try to stop all the nodes
> Halting parallel program mdrun on CPU 51 out of 120
> Error on node 81, will try to stop all the nodes
> Halting parallel program mdrun on CPU 81 out of 120
> Error on node 63, will try to stop all the nodes
> Halting parallel program mdrun on CPU 63 out of 120
> Error on node 33, will try to stop all the nodes
> Halting parallel program mdrun on CPU 33 out of 120
> Error on node 54, will try to stop all the nodes
> Halting parallel program mdrun on CPU 54 out of 120
> Error on node 24, will try to stop all the nodes
> Halting parallel program mdrun on CPU 24 out of 120
> Error on node 84, will try to stop all the nodes
> Halting parallel program mdrun on CPU 84 out of 120
> Error on node 36, will try to stop all the nodes
> Halting parallel program mdrun on CPU 36 out of 120
> Error on node 39, will try to stop all the nodes
> Halting parallel program mdrun on CPU 39 out of 120
>
> gcq#359: "If it weren't for bad luck, we'd have no luck at all" (The Unthanks)
>
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 42
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 21
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 114
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 66
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 45
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 18
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 93
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 117
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 69
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 90
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 75
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 108
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 30
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 72
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 15
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 99
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 87
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 78
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 57
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 96
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 27
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 48
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 60
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 81
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 24
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 102
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 33
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 51
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 36
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 84
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 54
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 39
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 9
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 12
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 105
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 111
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 63
> srun: error: cu08n87: tasks 0,3,6: Exited with exit code 255
> srun: error: cu08n124: tasks 18,21: Exited with exit code 255
> srun: error: cu09n191: tasks 66,69: Exited with exit code 255
> srun: error: cu11n188: tasks 114,117: Exited with exit code 255
> srun: error: cu11n185: tasks 90,93: Exited with exit code 255
> srun: error: cu09n141: tasks 42,45: Exited with exit code 255
> srun: error: cu09n192: tasks 72,75,78: Exited with exit code 255
> srun: error: cu11n187: tasks 105,108,111: Exited with exit code 255
> srun: error: cu11n186: tasks 96,99,102: Exited with exit code 255
> srun: error: cu08n125: tasks 24,27,30: Exited with exit code 255
> srun: error: cu08n110: tasks 9,12,15: Exited with exit code 255
> srun: error: cu08n126: tasks 33,36,39: Exited with exit code 255
> srun: error: cu09n142: tasks 48,51,54: Exited with exit code 255
> srun: error: cu11n184: tasks 81,84,87: Exited with exit code 255
> srun: error: cu09n164: tasks 57,60,63: Exited with exit code 255

--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================
--
gmx-users mailing list    gmx-users at gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the
www interface or send it to gmx-users-request at gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists



