Add warning if GPU model grid Nx, Ny are not multiple of 16 #398
Conversation
<3 for using `grid.Nx`, etc.

Hmm, actually do you want to move our hard-coded values of
Codecov Report

```diff
@@            Coverage Diff             @@
##           master     #398      +/-   ##
==========================================
+ Coverage   64.62%   65.99%   +1.36%
==========================================
  Files          23       23
  Lines        1405     1532     +127
==========================================
+ Hits          908     1011     +103
- Misses        497      521      +24
```

Continue to review full report at Codecov.
Codecov Report

```diff
@@            Coverage Diff             @@
##           master     #398      +/-   ##
==========================================
+ Coverage   64.62%   70.45%   +5.82%
==========================================
  Files          23       23
  Lines        1405     1682     +277
==========================================
+ Hits          908     1185     +277
  Misses        497      497
```

Continue to review full report at Codecov.
Add warning if GPU model grid Nx, Ny are not multiple of 16

Former-commit-id: 56ca04c
Add a warning if trying to create a GPU model with a grid where `Nx` or `Ny` is not a multiple of 16. Some GPU kernels still use the hard-coded `Tx = Ty = 16` for calculating thread-block layouts. See PR #308 for why.

Even if we were to get rid of the hard-coded `Tx, Ty`, there will always be weird grid sizes like 17x33x47 that run with no problem on a CPU, but for which you might not be able to create a perfect thread-block layout without launching extra threads that do nothing. Extra threads can do bad things, like apply a boundary condition twice, if we're not careful.

Apologies to @beta-effect who was burned by this in the past.
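The divisibility check and the imperfect-layout problem above can be sketched as follows. This is illustrative Python, not the package's actual Julia implementation; `TX`, `TY`, and the function names are hypothetical stand-ins for the hard-coded `Tx = Ty = 16` values:

```python
import math
import warnings

# Hard-coded thread-block dimensions, as described above.
TX = TY = 16

def check_gpu_grid(Nx, Ny):
    """Warn if Nx or Ny is not a multiple of the thread-block size."""
    if Nx % TX != 0 or Ny % TY != 0:
        warnings.warn(
            f"Grid size ({Nx}, {Ny}) is not a multiple of ({TX}, {TY}); "
            "GPU kernels will launch extra, idle threads."
        )

def block_layout(Nx, Ny):
    """Thread-block counts via ceil division, plus the number of extra
    threads per x-y slice that cover no grid point."""
    bx = math.ceil(Nx / TX)
    by = math.ceil(Ny / TY)
    extra = bx * TX * by * TY - Nx * Ny
    return bx, by, extra

# A 17x33 slice needs 2x3 blocks of 16x16 threads: 1536 threads
# for 561 grid points, leaving 975 threads with nothing to do.
print(block_layout(17, 33))
```

Inside a kernel, those extra threads are typically neutralized with a bounds guard (e.g. `if i <= Nx && j <= Ny` in the Julia kernel body) so they cannot, say, apply a boundary condition twice.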