
Fix a bug in CUDA scratch memory pool #4673

Merged · 1 commit · Jan 13, 2022

Conversation

Rombur
Member

Rombur commented Jan 13, 2022

There is a bug when we try to acquire an object from the scratch memory pool: we do the opposite of what we want. If we manage to acquire a free object, we mark it as busy and then move on to try to acquire the next one; the loop only exits when we find a busy object. This code is unique to the CUDA backend and does not appear in HIP or SYCL.

Member

dalg24 left a comment


I see how this was wrong, but it is not obvious to me how it was causing the issue reported in #4665.

Would you please elaborate.

Member

dalg24 commented Jan 13, 2022

Retest this please

Member

dalg24 commented Jan 13, 2022

Would you please elaborate.

Never mind, I got it. The first time around it marks all spots in the pool as in-use and takes the 1st one. The second time around it marks the 1st one as in-use and uses the 2nd.

[ ] 1  1  1
 1 [ ] 1  1
[ ] 0  1  1
 1  1 [ ] 1
[ ] 1  0  1
 1 [ ] 0  1
[ ] 0  0  1
 1  1  1 [ ]
[ ] 1  1  0
 1 [ ] 1  0
[ ] 0  1  0
 1  1 [ ] 0
etc.

Eventually it uses all 10 spots.

@dalg24 dalg24 merged commit 1d71653 into kokkos:develop Jan 13, 2022
Member

dalg24 commented Jan 13, 2022

All CUDA builds passed. Not bothering to wait for SYCL.
