Use of tersoff/gpu potentials with pair hybrid #108

Closed
ndtrung81 opened this issue Jul 1, 2016 · 3 comments
@ndtrung81 (Contributor)

When tersoff/gpu, tersoff/zbl/gpu, or tersoff/mod/gpu are used with pair hybrid, the run crashes due to bugs in device memory access. The input script that reproduces the issue is attached.
in.hybrid.txt

@akohlmey (Member) commented Jul 1, 2016

There is one final and rather obscure issue that I observed while running the test input. When using the attached input with 4 or more MPI tasks, the results are suddenly wrong. I've tracked this down to the need to correctly handle the case of an empty neighbor list, which is much more likely for hybrid styles than for regular calculations. The problem seems to be caused by this line:
https://github.com/lammps/lammps/blob/lammps-icms/lib/gpu/lal_balance.h#L162
where the number of "local" atoms is forced to be > 0. Unfortunately, changing it so that it returns zero when there are no atoms in the list results in kernel launch failures. On the other hand, when it is set to 1, bogus data is copied from the GPU to the host.
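
For readers without the GPU library source at hand, the trade-off described above can be sketched as below. This is a minimal, hypothetical illustration, not the real lal_balance.h code; the names DeviceBalancer and gpu_count are invented. The point is the clamp: the count of local atoms handed to the device is forced to be at least one, so an empty sub-style neighbor list either yields a spurious one-atom copy-back (clamp kept) or a zero-sized kernel launch that fails (clamp dropped).

```cpp
// Hypothetical sketch of the clamp described above -- not the real
// lal_balance.h code, only the shape of the trade-off it creates.
#include <algorithm>
#include <cstdio>

struct DeviceBalancer {
  double split = 1.0;  // fraction of the neighbor list handled on the GPU

  // How many local atoms should the GPU process this step?
  int gpu_count(int inum_full) const {
    int inum = static_cast<int>(split * inum_full);
    // The problematic clamp: even for an empty sub-style neighbor list
    // (inum_full == 0), at least one "atom" is claimed for the device.
    //  * keep the clamp -> the host later copies one bogus answer back;
    //  * drop the clamp -> a zero-sized kernel launch, which fails.
    return std::max(inum, 1);
  }
};

int main() {
  DeviceBalancer b;
  std::printf("%d\n", b.gpu_count(0));    // prints 1 although the list is empty
  std::printf("%d\n", b.gpu_count(256));  // prints 256
}
```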

akohlmey reopened this Jul 1, 2016
@ndtrung81 (Contributor, Author)

Thanks for the hints, Axel. Indeed, it involves an empty neighbor list of local atoms. For this case the early-return checks are currently incorrect, which allows the program to continue to the point where the balancer is invoked. I have fixed the checks and created a pull request for this issue. I tested the input script with 4, 6, and 8 MPI tasks, and it seems to work correctly now. Please double-check this on your side. Thanks!
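
The fix described here amounts to returning on an empty local neighbor list before any device work is scheduled, so the balancer (and its clamp to 1) is never reached. A minimal sketch of that guard, using invented names rather than the actual LAMMPS GPU-package signatures:

```cpp
// Hypothetical sketch of the fix's shape -- not the actual LAMMPS GPU-package
// code.  The point is only the order of operations: return on an empty local
// neighbor list *before* the load balancer is consulted.
#include <cstdio>

// Stand-in for the clamp from the previous sketch.
static int clamped_gpu_count(int inum_full) {
  return inum_full > 0 ? inum_full : 1;
}

// Stand-in for a GPU pair style's per-step entry point.
void compute(int inum_full /* local atoms in this sub-style's list */) {
  // With pair hybrid, a sub-style may legitimately own zero local atoms on a
  // given MPI rank.  Returning here keeps the balancer from ever running, so
  // no device buffers are touched and nothing bogus is copied back.
  if (inum_full == 0) return;

  int inum = clamped_gpu_count(inum_full);  // safe: inum_full > 0 here
  std::printf("launching kernels for %d local atoms\n", inum);
  // ... resize buffers, pack neighbors, launch kernels ...
}

int main() {
  compute(0);    // empty sub-list: no device work scheduled
  compute(128);  // normal case
}
```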

@akohlmey (Member) commented Jul 2, 2016

Excellent! It seems to work for me, too.
