RE: amber-developers: Fw: How many atoms?

From: Yong Duan <duan@ucdavis.edu>
Date: Tue, 4 Dec 2007 17:30:56 -0800

 
I don't work on that type of system either. But I don't think this is just
an advertising issue. There are (many) interesting systems and problems that
may require considerably larger sizes. I can easily think of a few such
systems that could be studied within our simulation time scales, and good
science may emerge. With Ranger and other peta-scale systems coming online,
simulations on large systems start to make sense. However, to accommodate
those large systems, the file formats have to be changed. The code also
needs changes (likely messy). If it has not been changed already (or has
it been??), it is probably too late to do now.
 
 
yong
 
-----Original Message-----
From: owner-amber-developers@scripps.edu
[mailto:owner-amber-developers@scripps.edu] On Behalf Of Carlos Simmerling
Sent: Tuesday, December 04, 2007 2:35 PM
To: amber-developers@scripps.edu
Subject: Re: amber-developers: Fw: How many atoms?


Bob,
I have a BG/L (as you know), but none of my current systems are that large.
Carlos


On Dec 4, 2007 2:14 PM, Robert Duke <rduke@email.unc.edu> wrote:


Hello folks!
I am working hard on high-scaling pmemd code, and in the course of the work
it became clear to me, due to large async I/O buffers and other issues, that
going to very high atom counts may require a bunch of extra work, especially
on certain platforms (BG/L in particular...). I posed the question below to
Dave Case; he suggested I bounce it off the list, so here it is. The crux
of the matter is how people feel about having an MD capability in pmemd for
systems bigger than 999,999 atoms in the next release. Please respond to
the dev list if you have strong feelings in either direction.
Thanks much! - Bob

----- Original Message -----
From: "Robert Duke" <rduke.email.unc.edu >
To: "David A. Case" <case.scripps.edu>
Sent: Tuesday, December 04, 2007 8:45 AM
Subject: How many atoms?


> Hi Dave,
> Just thought I would ping you about how strong the desire is to go above
> 1,000,000-atom systems in the next release. I personally see this as more
> of an advertising issue than real science; it's hard to get good
> statistics/good science on 100,000 atoms, let alone 10,000,000 atoms.
> However, we do have competition. The prmtop is not an issue, but the
> inpcrd format is, and one thing that could be done is to move to
> supporting the same type of flexible format in the inpcrd as we do in the
> new-style prmtop. Tom D. has an inpcrd format in amoeba that would
> probably do the trick; I can easily read this in pmemd but not yet write
> it (I have actually pulled the code out - left it in the amoeba version,
> of course, but can put it back in as needed). I ask the question now
> because I am already hitting size issues on BG/L on something like
> cellulose. Some of this I can fix; some of it is really more
> appropriately fixed by running on 64-bit memory systems where there
> actually is multi-GB physical memory. The problem is particularly bad
> with some new code I am developing, due to extensive async I/O and
> requirements for buffers that, at least theoretically, could be pretty
> big (up to natom is possible; by spending a couple of days writing really
> complicated code I can actually handle this in a small amount of space
> with effectively no performance impact - but it is the sort of thing that
> will be touchy and require additional testing). Anyway, I do want to
> gauge the desire to move up past 999,999 atoms, and to make the point
> that on something like BG/L it would actually require a lot more work to
> be able to run multi-million-atom problems (basically, I have to go back
> and look at all the allocations, make them dense rather than sparse by
> doing all indexing through lists, allow for adaptive minimal I/O buffers,
> etc. - messy stuff, some of it stemming from having to allocate lots of
> arrays dimensioned by natom).
> Best Regards - Bob
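
To make the format ceiling concrete: 999,999 is what a six-character
integer field can hold, and a self-describing header of the new-style
prmtop kind removes the cap entirely. Below is a minimal Python sketch of
the two approaches (illustrative only - pmemd itself is Fortran, and the
field widths here are assumptions for the sketch, not the actual inpcrd
spec):

    # Why a fixed-width atom-count field caps out, and how a prmtop-style
    # %FLAG/%FORMAT header avoids the cap. Widths are assumptions for the
    # sketch, not the real inpcrd spec.

    def write_fixed_natom(natom, width=6):
        """Old-style header: natom printed into a fixed-width column."""
        field = "%*d" % (width, natom)
        if len(field) > width:
            # A Fortran I6 edit descriptor would print ****** here instead.
            raise ValueError("natom=%d overflows a %d-char field" % (natom, width))
        return field

    def write_flexible_natom(natom):
        """New-style header: self-describing, so any atom count fits."""
        return "%%FLAG ATOM_COUNT\n%%FORMAT(I10)\n%10d" % natom

    print(write_fixed_natom(999999))       # fits exactly: "999999"
    print(write_flexible_natom(1200000))   # fine at any size
    write_fixed_natom(1000000)             # raises: needs 7 characters

The flexible version costs a couple of header lines, but it makes the
reader independent of any hard-wired width, which is what the new-style
prmtop already does.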


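The "dense rather than sparse" allocation point can be sketched briefly
too: instead of every task allocating arrays dimensioned by natom, each
task packs only the atoms it owns and reaches them through an index list.
A hedged Python illustration (the names and the ownership pattern are made
up for the example; this is not pmemd's actual data layout):

    # Sketch of dense per-task storage: rather than coordinate arrays
    # dimensioned by natom on every task (sparse in use, huge in memory),
    # each task packs the atoms it owns and indexes through a list.

    natom = 1_000_000                      # global atom count
    owned = list(range(0, natom, 64))      # global ids this task owns (example)

    # Dense storage: one slot per owned atom, not per global atom.
    coords = [(0.0, 0.0, 0.0)] * len(owned)

    # Index list: global atom id -> slot in the dense array.
    local_of = {gid: slot for slot, gid in enumerate(owned)}

    def get_coord(gid):
        return coords[local_of[gid]]       # O(1) lookup through the list

    # Memory scales with atoms owned per task, not with natom:
    print(len(coords), "slots instead of", natom)

Per-task memory then scales with the atoms a task owns rather than with
natom itself, which is the difference that matters on a small-memory node
like BG/L's.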


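Likewise, the "adaptive minimal I/O buffers" idea - handling an up-to-natom
output stream in a small amount of space - amounts to flushing through a
fixed-size chunk instead of staging everything at once. A toy sketch (again
Python and purely illustrative, not the actual async I/O code):

    # Sketch of a bounded output buffer: stream natom coordinates through
    # a fixed-size chunk instead of staging an up-to-natom buffer.

    import io

    CHUNK = 4096                           # atoms per flush (the bounded buffer)

    def write_coords(out, coord_iter):
        buf = []
        for xyz in coord_iter:
            buf.append("%12.7f%12.7f%12.7f" % xyz)
            if len(buf) == CHUNK:          # flush once the small buffer fills
                out.write("\n".join(buf) + "\n")
                buf.clear()
        if buf:                            # final partial chunk
            out.write("\n".join(buf) + "\n")

    out = io.StringIO()
    write_coords(out, ((i * 0.1, 0.0, 0.0) for i in range(10_000)))

The buffer stays at CHUNK entries no matter how large natom gets; the
touchy part in real code is overlapping those flushes with computation,
which the sketch does not attempt.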



-- 
===================================================================
Carlos L. Simmerling, Ph.D.
Associate Professor                 Phone: (631) 632-1336
Center for Structural Biology       Fax:   (631) 632-1555 
CMM Bldg, Room G80
Stony Brook University              E-mail: carlos.simmerling@gmail.com
Stony Brook, NY 11794-5115          Web: http://comp.chem.sunysb.edu
=================================================================== 
Received on Wed Dec 05 2007 - 06:07:35 PST