[gmx-developers] CVS code on Opteron/Xeon EMT64/IT2/G5

Erik Lindahl lindahl at sbc.su.se
Fri Feb 18 22:56:30 CET 2005


Hi Sherwin,

First, I assume this is the CVS code, right?

While it is great that you're testing it, I want to stress that it's 
not quite ready for production use without careful testing to make sure 
you get the same results.
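As an aside, "the same results" here means agreement to within roundoff, not bitwise identity. A minimal sketch of such a comparison (a hypothetical helper, not part of GROMACS; the tolerance value is an assumption) in Python:

```python
import math

def energies_match(e_ref, e_test, rel_tol=1e-6):
    """Return True if two energy values agree to within a roundoff-level
    relative tolerance. Hypothetical helper, not part of GROMACS; the
    default rel_tol is only a rough single-precision-scale guess."""
    return math.isclose(e_ref, e_test, rel_tol=rel_tol)

# A tiny drift well below the tolerance counts as "the same result":
print(energies_match(-1.2345e4, -1.2345e4 * (1 + 1e-10)))  # True
# A percent-level discrepancy is flagged as a real difference:
print(energies_match(-1.2345e4, -1.2500e4))                # False
```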

I'm interested in the error you're seeing, and would be happy to have a 
look if you send me:

1. A detailed report of _exactly_ what the error is. The log files help 
a lot, since they contain info like the architecture detection.
2. The run input files (.tpr) for the two cases (e.g. 1 vs. 2 nodes) 
that differ.
3. If you have it, all the files (mdp,gro,top) to regenerate the input. 
In some cases it is easier to debug if we can reduce the system.

I recently found out that at least one home-grown mpicc script for the 
Intel compilers on ia64 didn't understand the assembly files, so all 
bug reports related to ia64 are very welcome.

I haven't had time to experiment with compiler flags on ia64, so for 
now you should just make sure that -O3 is used. The inner loops are now 
often so fast that the neighbor searching and other parts become the 
bottleneck, so we'd welcome recommendations for special flags that 
might improve Intel compiler performance.

Cheers,

Erik


On Feb 18, 2005, at 8:50 PM, Sherwin J. Singer wrote:

> Mike:
>
> Are you running serial on a single CPU, or parallel?
>
> Students in my group are running Gromacs on an itanium cluster at the
> Ohio Supercomputer Center (www.osc.edu/hpc/computing/ipf), and we are
> having some strange problems with the Coulomb(LR) energy.
>
> Basically, the Coulomb(LR) energy evaluates to different numerical
> values depending on the number of nodes being used and the size of the
> simulated system.  The difference is substantial, and is not a roundoff
> effect.
>
> We would appreciate it if we could look for the same effect on your
> itanium, if it is a cluster.  If the problem is reproduced, there could
> be a bug in Gromacs.  If the problem is not reproduced, we would love
> to learn how you compiled Gromacs and pass that knowledge on to the
> system administrator at the Ohio Supercomputer Center.
>
> Are you able to help us out?
>
>
>
>
>
> On Fri, 2005-02-18 at 09:53, Mike Sullivan wrote:
>> I am running the gmx bench cases under Suse Linux Enterprise 9.0 on
>> Opteron, Xeon EMT64, Itanium2 and G5 using the CVS version of Gromacs.
>> The hardware is available online for anyone who wants to use it for
>> Gromacs development.  The G5 will most likely have to go back to Apple
>> at the end of next week (April 25/2005), so if you want to run some
>> tests let me know asap.
>> Please send me an email if you would like an account.
>>
>>
>>                                                          Regards
>>
>>                                                          Mike
>>
>>
> -- 
>
> Sherwin Singer
> Department of Chemistry
> Ohio State University
> 100 W. 18th Ave.
> Columbus, OH 43210
> 614-292-8909
> 614-292-1685 (fax)
>
> _______________________________________________
> gmx-developers mailing list
> gmx-developers at gromacs.org
> http://www.gromacs.org/mailman/listinfo/gmx-developers
> Please don't post (un)subscribe requests to the list. Use the
> www interface or send it to gmx-developers-request at gromacs.org.

