[gmx-users] Reverse micelle clustering issue

ABEL Stephane 175950 Stephane.ABEL at cea.fr
Wed Jul 22 10:57:44 CEST 2015


Hello

To center your RM inside the box you can use two successive trjconv commands, one with -pbc cluster and one with -pbc mol.

In index.ndx (a make_ndx sketch for building such an index follows the listing):

0  all
1  AOT
2  water
3  AOT_Water
4  ISO
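
If you do not have the merged group yet, it can be built interactively with make_ndx; the group numbers below are only illustrative, so check what make_ndx actually prints for your system:

make_ndx -f my_system.gro -o index.ndx
  1 | 2               <- merge the AOT and water groups into a new group
  name 5 AOT_Water    <- rename it, using the number make_ndx assigned
  q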

Step 1 -- select 1 1 0 (or 2 2 0), i.e. the group to cluster, the group to center on, and the output group:

echo 1 1 0 | trjconv -f my_initial.xtc -s my_tpr.tpr -pbc cluster -ur compact -center -o my_clustered.xtc

Step 2 -- select 1 0 (or 2 0), i.e. the group to center on and the output group:

echo 1 0 | trjconv -f my_clustered.xtc -s my_tpr.tpr -pbc mol -ur compact -center -o my_all_centered.xtc

With these two commands, the RM should end up inside the box.

BTW, your box looks a bit too small to me. What is the isooctane mass fraction?

HTH

Stephane

------------------------------

Message: 3
Date: Tue, 21 Jul 2015 17:38:30 -0300
From: "V.V.Chaban" <vvchaban at gmail.com>
To: gmx-users <gmx-users at gromacs.org>
Subject: Re: [gmx-users] Reverse micelle clustering issue
Message-ID:
        <CAPXdD+aVsj3pkS0+cSYCatGP1TSod9B-rqjXbETk+aytsJJgNA at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8

The general advice is to use trjconv with each keyword independently.

In certain cases its behavior is fairly bizarre if one uses all the keywords at the same time. Probably the order in which the procedures are called in the code matters...
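
For example, the processing can be split into single-keyword passes like this; the group numbers and file names below are only placeholders, so use the ones that match your own index file:

echo 1 0 | trjconv -f traj.xtc  -s topol.tpr -n index.ndx -pbc cluster -o pass1.xtc
echo 1 0 | trjconv -f pass1.xtc -s topol.tpr -n index.ndx -center -o pass2.xtc
echo 0   | trjconv -f pass2.xtc -s topol.tpr -n index.ndx -pbc mol -ur compact -o final.xtc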


Professor Vitaly V. Chaban




On Tue, Jul 21, 2015 at 3:08 PM, Tyler Cropley <tyler.cropley at wagner.edu> wrote:
> Dear Gromacs users,
>
>
>  We ran a simulation of an AOT reverse micelle in isooctane solvent. We
> followed the instructions at
> http://www.gromacs.org/Documentation/How-tos/Micelle_Clustering . For
> clustering we selected the group that contains AOT, and for output we
> selected the entire system. The reverse micelle appears outside of the
> solvent box in VMD. Here is a picture:
> http://s24.photobucket.com/user/tycropley/media/RM-issue_zpsbngarszw.png.html
> We used -center and -boxcenter with trjconv, but it did not work.
>
>
>  Is there a way to ensure that the corrected trajectory displays the
> reverse micelle inside the solvent box? Or is there something more
> seriously wrong?
>
>
>  Thank you,
>
> Tyler


------------------------------

Message: 4
Date: Tue, 21 Jul 2015 17:16:41 -0500
From: Krzysztof Kuczera <kkuczera at ku.edu>
To: <gmx-users at gromacs.org>
Subject: Re: [gmx-users] GROMACS 5.0.5 GPU version on K620
Message-ID: <55AEC4C9.5070206 at ku.edu>
Content-Type: text/plain; charset="utf-8"; format=flowed

Hi Szilárd,

The test case worked after I added the flags you recommended, though the
speed was not as high as I had hoped - equivalent to about 6 CPU cores -
I guess my GPU is not so hot.
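
In case it is useful to others, a configure step that picks up the suggested nvcc flag can look roughly like this (the paths and options here are only illustrative, not my exact command line):

cmake .. -DGMX_GPU=ON \
         -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda-7.0 \
         -DCUDA_NVCC_FLAGS_RELEASE="-gencode;arch=compute_50,code=sm_50"
make && make install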

Below is the output from "mdrun -version".  Please let me know if you
see things that might be optimized there.

Thanks for your excellent help!

Krzysztof
-------------------------------
Gromacs version:    VERSION 5.0.5
Precision:          single
Memory model:       64 bit
MPI library:        thread_mpi
OpenMP support:     enabled
GPU support:        enabled
invsqrt routine:    gmx_software_invsqrt(x)
SIMD instructions:  AVX2_256
FFT library:        fftw-3.3.4-fma-sse2-avx
RDTSCP usage:       enabled
C++11 compilation:  disabled
TNG support:        enabled
Tracing support:    disabled
Built on:           Tue Jul 21 16:25:29 CDT 2015
Built by:           kuczera at lolipop-chem-ku-edu [CMAKE]
Build OS/arch:      Linux 3.10.0-229.4.2.el7.x86_64 x86_64
Build CPU vendor:   GenuineIntel
Build CPU brand:    Intel(R) Xeon(R) CPU E5-2687W v3 @ 3.10GHz
Build CPU family:   6   Model: 63   Stepping: 2
Build CPU features: aes apic avx avx2 clfsh cmov cx8 cx16 f16c fma htt lahf_lm mmx msr nonstop_tsc pcid pclmuldq pdcm pdpe1gb popcnt pse rdrnd rdtscp sse2 sse3 sse4.1 sse4.2 ssse3 tdt x2apic
C compiler:         /usr/bin/cc GNU 4.8.3
C compiler flags:   -march=core-avx2 -Wno-maybe-uninitialized -Wextra -Wno-missing-field-initializers -Wno-sign-compare -Wpointer-arith -Wall -Wno-unused -Wunused-value -Wunused-parameter -O3 -DNDEBUG -fomit-frame-pointer -funroll-all-loops -fexcess-precision=fast -Wno-array-bounds
C++ compiler:       /usr/bin/c++ GNU 4.8.3
C++ compiler flags: -march=core-avx2 -Wextra -Wno-missing-field-initializers -Wpointer-arith -Wall -Wno-unused-function -O3 -DNDEBUG -fomit-frame-pointer -funroll-all-loops -fexcess-precision=fast -Wno-array-bounds
Boost version:      1.53.0 (external)
CUDA compiler:      /usr/local/cuda-7.0/bin/nvcc nvcc: NVIDIA (R) Cuda compiler driver; Copyright (c) 2005-2015 NVIDIA Corporation; Built on Mon_Feb_16_22:59:02_CST_2015; Cuda compilation tools, release 7.0, V7.0.27
CUDA compiler flags: -gencode;arch=compute_20,code=sm_20;-gencode;arch=compute_20,code=sm_21;-gencode;arch=compute_30,code=sm_30;-gencode;arch=compute_35,code=sm_35;-gencode;arch=compute_35,code=compute_35;-gencode;arch=compute_50,code=compute_50;-use_fast_math;-Xcompiler;-fPIC;-march=core-avx2;-Wextra;-Wno-missing-field-initializers;-Wpointer-arith;-Wall;-Wno-unused-function;-O3;-DNDEBUG;-fomit-frame-pointer;-funroll-all-loops;-fexcess-precision=fast;-Wno-array-bounds
CUDA driver:        7.0
CUDA runtime:       7.0


On 7/17/15 5:39 PM, Szilárd Páll wrote:
> Krzysztof,
>
> While the GROMACS 5.0.x build system does not explicitly generate options
> targeting 5.x devices, the binary you built should still be compatible with
> your GPU. You can try adding the device-specific optimization flags with the
> -DCUDA_NVCC_FLAGS_RELEASE="-gencode;arch=compute_50,code=sm_50"
> cmake flag, but I'm not sure this will fix the issue.
>
> What does gmx -version show?
>
> Cheers,
>
> --
> Szilárd
>
> On Fri, Jul 17, 2015 at 11:49 PM, Krzysztof Kuczera <kkuczera at ku.edu> wrote:
>
>> Hi Group,
>>
>> I am getting a run-time error on my Linux workstation with a K620 GPU
>> for the GPU version of GROMACS 5.0.5, using gcc 4.8.3 and CUDA Toolkit 7.0.
>> I had no problem compiling the code, but got this error when starting a
>> test case:
>>
>> Program mdrun, VERSION 5.0.5
>> Source code file:
>> /home/kuczera/prog/gromacs-5.0.5/src/gromacs/mdlib/nbnxn_cuda/
>> nbnxn_cuda.cu, line: 619
>>
>> Fatal error:
>> cudaStreamSynchronize failed in cu_blockwait_nb: an illegal memory access
>> was encountered
>>
>> Searching the Web, I found that this type of error was already resolved in
>> GROMACS 4.6 some time ago.
>> Could somebody suggest a solution?
>>
>> We have been able to compile and run GROMACS 5.0.4 on an older GPU with
>> compute capability 3.5; my newer K620 has compute capability 5.0 - could
>> this be the problem?
>>
>> Thanks
>> Krzysztof
>>
>> --
>> Krzysztof Kuczera
>> Departments of Chemistry and Molecular Biosciences
>> The University of Kansas
>> 1251 Wescoe Hall Drive, 5090 Malott Hall
>> Lawrence, KS 66045
>> Tel: 785-864-5060 Fax: 785-864-5396 email: kkuczera at ku.edu
>> http://oolung.chem.ku.edu/~kuczera/home.html
>>


--
Krzysztof Kuczera
Departments of Chemistry and Molecular Biosciences
The University of Kansas
1251 Wescoe Hall Drive, 5090 Malott Hall
Lawrence, KS 66045
Tel: 785-864-5060 Fax: 785-864-5396 email: kkuczera at ku.edu
http://oolung.chem.ku.edu/~kuczera/home.html



------------------------------

--
Gromacs Users mailing list

* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-request at gromacs.org.

End of gromacs.org_gmx-users Digest, Vol 135, Issue 121
*******************************************************

