Ross Walker wrote:
>>Hi Wei,
>>
>>A quick test shows that sander.PIMD is not being built by default,
>>either in serial or in parallel, but the default test suite is set up
>>to run the PIMD test cases. Can you check this, please?
test.sander.PIMD and test.sander.PIMD.MPI should be removed entirely
from test/Makefile.
>>
>>On another note, what is the status of NEB under the new system? Does
>>it need to be run with sander.PIMD and, I assume, sander.PIMD.MPI?
>>Also, are things linear in memory for this as well?
NEB has been implemented under the new framework, which means
you will use sander.MPI to do the calculation, and things are linear
in memory here as well. I am running tests on it, and I will add a
NEB test case very soon. Also, I want to ask you: do you think
partial NEB would be useful? I might be able to implement it.
>>On one other note ;-) does this mean a lot of the ifdef PIMD stuff
>>will be going away? I notice, for example, that the qm_mm call in
>>force is much, much simpler, which is nice. However, there are still a
>>lot of PIMD ifdefs in force.f, although a number of these are around
>>variable declarations like spring_energy or alloc_error. I assume
>>these can go away now?
Yes, all the "#ifdef PIMD" blocks will either be removed or merged
into "#ifdef LES", as will "#ifdef CMD", "#ifdef PIMD_A1ST", and
"#ifdef CMD_A1ST". The code will be much simpler.
Sincerely,
Wei
>>All the best
>>Ross
>>
>>/\
>>\/
>>|\oss Walker
>>
>>| HPC Consultant and Staff Scientist |
>>| San Diego Supercomputer Center |
>>| Tel: +1 858 822 0854 | EMail:- ross.rosswalker.co.uk |
>>| http://www.rosswalker.co.uk | PGP Key available on request |
>>
>>Note: Electronic Mail is not secure, has no guarantee of delivery, may not
>>be read every day, and should not be used for urgent or sensitive issues.
>>
>>>>-----Original Message-----
>>>>From: owner-amber-developers.scripps.edu
>>>>[mailto:owner-amber-developers.scripps.edu] On Behalf Of Wei Zhang
>>>>Sent: Tuesday, October 17, 2006 09:26
>>>>To: amber-developers.scripps.edu
>>>>Subject: amber-developers: Major change on CVS (for new PIMD
>>>>implementation)
>>>>
>>>>Dear All,
>>>>
>>>>I just checked in a new implementation of PIMD based on the
>>>>multi-sander framework. Meanwhile, the following executables have
>>>>been removed from the Makefile: sander.PIMD, sander.CMD,
>>>>sander.PIMD.A1ST, and sander.CMD.A1ST. In the new implementation,
>>>>all PIMD simulations are run through "sander.MPI". A new control
>>>>parameter "ipimd" has been introduced to indicate the simulation
>>>>type: "ipimd=0" means a normal MD simulation (no PIMD), "ipimd=1"
>>>>means primitive PIMD, and "ipimd=2" means normal-mode PIMD or CMD
>>>>(depending on "adiab_param"). Test cases for the new implementation
>>>>can be found under the directory "amber10/test/full_pimd".
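For concreteness, a minimal mdin for a primitive-PIMD run under the new
scheme might look like the sketch below. Only "ipimd" comes from the
message above; the other keywords are standard sander &cntrl parameters
chosen for illustration, and a real PIMD run may need different
settings:

```
 &cntrl
   irest = 0, ntx = 1,          ! fresh start from input coordinates
   nstlim = 1000, dt = 0.001,   ! 1000 steps of 1 fs
   ntt = 3, temp0 = 300.0,      ! Langevin thermostat at 300 K
   gamma_ln = 5.0,
   ipimd = 1,                   ! primitive PIMD (new parameter)
 /
```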
>>>>
>>>>The new implementation has the following advantages. First, it is
>>>>more efficient because of the linear memory layout. Second, it is
>>>>much easier to make it work with other new features of sander; for
>>>>example, we can now run PIMD with the AMOEBA force field. Moreover,
>>>>although AMOEBA itself has not been parallelized yet, we can run
>>>>AMOEBA-PIMD in parallel as long as the number of CPUs is less than
>>>>the number of beads.
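The memory claim is easy to picture: under the multi-sander framework
each processor group owns only its own bead(s), so per-task storage
scales with the number of atoms rather than atoms times beads. A toy
sketch (not Amber source code) of such a bead-to-group assignment:

```python
# Sketch of a multisander-style bead partition: P beads dealt out in
# contiguous blocks to ng processor groups, so each group stores
# coordinates only for the beads it owns.

def beads_for_group(nbeads, ngroups, igroup):
    """Return the bead indices owned by processor group `igroup`."""
    if nbeads % ngroups != 0:
        raise ValueError("nbeads must be divisible by ngroups")
    per_group = nbeads // ngroups
    start = igroup * per_group
    return list(range(start, start + per_group))

# With 8 beads and 4 groups, each group owns 2 beads:
print([beads_for_group(8, 4, g) for g in range(4)])
# -> [[0, 1], [2, 3], [4, 5], [6, 7]]
```

When the group count equals the bead count (one replica per group),
each task holds exactly one bead, which is the layout that lets a
serial force field like AMOEBA still run bead-parallel.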
>>>>
>>>>The shortcoming of the new implementation is also obvious: we will
>>>>no longer be able to run partial PIMD. To overcome this, the old
>>>>implementation will not be deleted but will be merged into
>>>>sander.LES, so we will be able to run partial PIMD through
>>>>sander.LES. It will also be fast if the quantum portion is
>>>>reasonably small.
>>>>
>>>>As you know, this is a major change. Although I have tested the
>>>>code (with ifort and g95), there is still a chance that some of
>>>>your stuff has been broken. So if you experience any unusual
>>>>behavior from sander after a cvs update, please contact me, and we
>>>>can figure it out together.
>>>>
>>>>Thank you very much!
>>>>
>>>>Sincerely,
>>>>
>>>>Wei
Received on Wed Oct 18 2006 - 06:07:30 PDT