[gmx-users] Gromacs setup on dual core machine

David van der Spoel spoel at xray.bmc.uu.se
Wed Mar 12 08:59:11 CET 2008


Jens Pohl wrote:
> Hello!
> 
> I have a similar problem with installing gromacs on a dual core machine. 
> When I follow the procedure described by Matheus Fazian Ige Gondo a bit further down (installing LAM/MPI, then FFTW, and finally GROMACS), everything seems to work, but LAM/MPI doesn't recognize the two cores, and mdrun stops with a fatal error. Both grompp and mdrun were run with -np 2, and mdrun reports that the input file was made for two nodes while it expected one.
> 
> lamnodes shows only one node, on cpu0 of localhost... What should I do in this case?
> 
Fetch fftw 3.1.1 (latest stable)
./configure  --enable-float
make install

Fetch openmpi 1.1.x (latest stable)
./configure
make install

Fetch gromacs
./configure --enable-mpi
make install

That's all you need
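The recipe above can be sketched as one script, assuming the three tarballs have already been unpacked next to it. The version numbers and the `build` helper are my own illustrations, not from the post; substitute whatever releases you actually downloaded:

```shell
# Sketch of the build order above; version numbers are period-appropriate examples.
set -e

build() {                     # helper (assumption): configure, compile, install
    ./configure "$@" && make && make install
}

# Uncomment once the unpacked source trees sit next to this script
# ("make install" may need root for the default prefix):
# ( cd fftw-3.1.1    && build --enable-float )   # single precision for GROMACS
# ( cd openmpi-1.1.4 && build )                  # Open MPI instead of LAM/MPI
# ( cd gromacs-3.3.3 && build --enable-mpi )     # link mdrun against MPI
```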


> Thanks in advance
> Jens
> 
> 
>> Hello Ricardo
>>
>> Did you install from source (make install)?
>> If so, go to the folder where you ran that command and run "make uninstall".
>>
>> Matheus Fazian Ige Gondo
>> Departamento de Física e Biofísica
>> Instituto de Biociências
>> Unesp - Botucatu - SP
>>  gondo at ibb.unesp.br
>> +55 (14) 3811-6254
>>
>>
>> On Tue, Mar 11, 2008 at 1:55 PM, Ricardo Soares <rsoares at fcfrp.usp.br> wrote:
>>  
>>
>> Kpiwara De X-nelo wrote:
>> Hello
>> You'll need FFTW (fftw.org), LAM/MPI (lam-mpi.org), and the GROMACS sources.
>> Download the 3 packages
>> Start with LAM/MPI (configure, make, make install).
>> Then FFTW (version 2.1.5 is needed; 3.2 is supported but is
>> still unstable): configure --enable-mpi, make, make install.
>> Then, finally, GROMACS: configure --with-fft=fftw2 --enable-mpi,
>> make, make install.
>>  
>> It should now be ready to run.
>> As a non-root user, type lamboot;
>> this starts LAM/MPI.
>>  
>> Run your simulations as usual, just changing these:
>>
>>   grompp -v -f xx -o xx -c xx -p xx -np #
>>
>> (the -np # flag generates a status file for # nodes)
>>
>> then
>>
>>   mpirun c0 C mdrun -v -s xx -o xx -e xx -c xx -g xx -np # >& xx
>>
>> The number of nodes (#) must be the same as used for grompp.
>>
>> "mpirun c0 C" tells it to run on the maximum available nodes.
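The LAM/MPI cycle above can be pulled together in one sketch. The `xx` file names are the post's own placeholders; `lamhosts` is a hypothetical boot-schema filename, and declaring `cpu=2` in it is also how LAM can be told about both cores of a dual-core host (the "lamnodes shows one node" symptom):

```shell
# Sketch of the LAM/MPI run cycle described above; the GROMACS and LAM
# commands themselves are commented out because they need real input files.
NP=2                                    # number of cores to use

# LAM boots only one node unless the boot schema declares both cores:
SCHEMA="localhost cpu=$NP"              # assumption: a single dual-core host
echo "$SCHEMA" > lamhosts
# lamboot -v lamhosts                   # start LAM as a non-root user
# lamnodes                              # should now list both CPUs

# grompp -v -f xx -o xx -c xx -p xx -np $NP
# mpirun c0 C mdrun -v -s xx -o xx -e xx -c xx -g xx -np $NP >& xx
# lamhalt                               # shut the LAM daemons down afterwards
```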
>>  
>> Hope I helped.
>>  
>> Matheus Fazian Ige Gondo
>> Departamento de Física e Biofísica
>> Instituto de Biociências
>> Unesp - Botucatu - SP
>>  gondo at ibb.unesp.br
>> +55 (14) 3811-6254
>> Hello Matheus, you sure did help!
>> But that way I'd need to uninstall gromacs and reinstall. Is there a
>> way to include the LAM/MPI package in an already installed instance of
>> Gromacs? If not, how do I uninstall it?
>>
>> Thanks again!
>> -- ___________________________________________________________
>> Ricardo Oliveira dos Santos Soares
>> Post-graduation Student in Biological Physics
>> University of Sao Paulo - USP
>> Faculty of Pharmaceutical Sciences of Ribeirao Preto - FCFRP
>> Phone: 55 (16) 3602-4840
>> Curriculum Lattes - http://lattes.cnpq.br/0777038258459931
>> ___________________________________________________________
> 
> _______________________________________________
> gmx-users mailing list    gmx-users at gromacs.org
> http://www.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/search before posting!
> Please don't post (un)subscribe requests to the list. Use the 
> www interface or send it to gmx-users-request at gromacs.org.
> Can't post? Read http://www.gromacs.org/mailing_lists/users.php


-- 
David van der Spoel, Ph.D.
Molec. Biophys. group, Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone:	+46184714205. Fax: +4618511755.
spoel at xray.bmc.uu.se	spoel at gromacs.org   http://folding.bmc.uu.se


