My preference would be to increase the tolerance. In my experience, many
of the TI and free energy tests suffer from this sort of problem. I'm not
sure what makes these applications more sensitive, but it probably has to
do with the numerical precision and convergence issues innate to the
methods themselves, the same fundamental reason we run many windows and
so on. I am only guessing, but insofar as I believe this to be a more
widespread problem, we should keep whatever tests we can.
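For what it's worth, here is a quick standalone sketch of the kind of
order-dependence Scott suspects. This is synthetic data, not the actual
GTI accumulation code: double-precision addition is not associative, so
summing the same per-term contributions in a different order (as a
parallel GPU reduction may do from run to run) can shift the low digits
of the total.

    // Demonstrates that floating-point addition is not associative:
    // summing identical terms in a different order can change the
    // result in the last bits, which can be enough to flip a printed
    // fourth decimal place in an energy total.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        // Synthetic "energy" contributions spanning several magnitudes.
        std::mt19937_64 rng(42);
        std::uniform_real_distribution<double> u(-1.0, 1.0);
        std::vector<double> terms(100000);
        for (double &t : terms) t = u(rng) * std::pow(10.0, 6.0 * u(rng));

        double forward = 0.0;          // sum in original order
        for (double t : terms) forward += t;

        std::shuffle(terms.begin(), terms.end(), rng);
        double shuffled = 0.0;         // same terms, different order
        for (double t : terms) shuffled += t;

        std::printf("forward  = %.10f\n", forward);
        std::printf("shuffled = %.10f\n", shuffled);
        std::printf("diff     = %.3e\n", forward - shuffled);
        return 0;
    }

On a typical run the two sums disagree by far more than machine epsilon
relative to the smallest terms, which is the same flavor of discrepancy
as the 1.00e-04 diff below.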
Dave
On Sat, Nov 14, 2020 at 5:28 PM Scott Le Grand <varelse2005.gmail.com>
wrote:
> This test appears to be ever so slightly nondeterministic. About 1 in 4
> runs on TOT is showing this tiny difference. I suspect this is because
> DPFP energy accumulation is not quite deterministic. Which is preferable
> here: increasing the tolerance by a factor of 2, or ignoring the test?
>
> possible FAILURE: check md_SC_NVT_SC_-1.o.dif
> /media/work/slegrand/amber/test/cuda/gti/SC_Correction/complex
> 3158c3158
> < Etot = 3.9278 EKtot = 146.6202 EPtot = 145.6113
> > Etot = 3.9277 EKtot = 146.6202 EPtot = 145.6113
> ### Maximum absolute error in matching lines = 1.00e-04 at line 3158 field 3
> ### Maximum relative error in matching lines = 2.55e-05 at line 3158 field 3
_______________________________________________
AMBER-Developers mailing list
AMBER-Developers.ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber-developers
Received on Sat Nov 14 2020 - 16:00:03 PST