[gmx-users] Parallel do_dssp analysis over mpi?

Erik Marklund erikm at xray.bmc.uu.se
Mon Nov 8 22:33:40 CET 2010


Justin A. Lemkul wrote 2010-11-08 21.05:
>
>
> Erik Marklund wrote:
>> You can use the -b and -e flags to analyze parts of your trajectory 
>> and patch the results together afterwards.
>
> But does that really save time?  Patching .xpm files is not entirely 
> straightforward.  Might end up taking a lot more time :)
>
Nevertheless, if the 48 h runtime is a problem, then splitting the 
analysis into shorter chunks is the way to go here. Note, however, that an 
MPI-enabled do_dssp would not deliver the results any faster than what 
I suggested; it would just be more convenient, since one wouldn't 
have to patch the xpm files together oneself. I think a 
frame-decomposition framework for the analysis tools is in the works, 
but I suspect it's far from ready.
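
For the splitting itself, something along these lines might work. This 
is only a rough sketch: the file names (traj.xtc, topol.tpr), the chunk 
count, and the group number fed to do_dssp are placeholders you would 
have to adapt to your own setup.

#!/usr/bin/env python
# Rough sketch: run do_dssp on non-overlapping time windows of one
# trajectory concurrently, giving one .xpm/.xvg pair per chunk.
# Assumes the GROMACS tools and the DSSP executable are set up as
# usual (DSSP environment variable pointing at the dssp binary);
# traj.xtc, topol.tpr and group "1" (Protein) are placeholders.
import subprocess

TOTAL_PS = 100000            # 100 ns trajectory, in ps
N_CHUNKS = 8                 # number of concurrent do_dssp jobs
CHUNK = TOTAL_PS // N_CHUNKS

procs = []
for i in range(N_CHUNKS):
    b = i * CHUNK
    e = TOTAL_PS if i == N_CHUNKS - 1 else (i + 1) * CHUNK
    # Note: -b/-e are inclusive, so the frame at a chunk boundary may
    # show up in two chunks; trim duplicates when patching afterwards.
    cmd = ["do_dssp",
           "-f", "traj.xtc", "-s", "topol.tpr",
           "-b", str(b), "-e", str(e),
           "-o", "ss_%d.xpm" % i,
           "-sc", "scount_%d.xvg" % i]
    # do_dssp asks interactively for a group; feed it the group number
    p = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    p.stdin.write(b"1\n")
    p.stdin.close()
    procs.append(p)

for p in procs:
    p.wait()
# The ss_*.xpm chunks still have to be patched together by hand,
# which, as Justin points out, is the non-trivial part.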

Erik
> -Justin
>
>>
>> Erik
>>
>> Justin A. Lemkul wrote 2010-11-08 20.51:
>>>
>>>
>>> Ali Naqvi wrote:
>>>> Dear All,
>>>> I have a trajectory of 100ns and have been trying to analyze it 
>>>> using the dssp program. I guess this question also goes for other 
>>>> analysis programs like g_rama, g_rdf and what not.
>>>>
>>>> Since the trajectory is big, it takes hours to analyze with 
>>>> do_dssp; in fact, 48 hours. I have been able to analyze only one set 
>>>> of 100ns and have 7 more trajectories. Is it possible to get it 
>>>> running in parallel to expedite the analysis? If so, which flags 
>>>> need to be appended?
>>>>
>>>
>>> The only Gromacs program that is MPI-aware is mdrun.  Unless you modify 
>>> the source code, none of the other commands will run in parallel.
>>>
>>> -Justin
>>>
>>>> Cordially,
>>>> Ali
>>>>
>>>
>>
>>
>


-- 
-----------------------------------------------
Erik Marklund, PhD student
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596,    75124 Uppsala, Sweden
phone:    +46 18 471 4537        fax: +46 18 511 755
erikm at xray.bmc.uu.se    http://folding.bmc.uu.se/



