LoweringError while allocating shared memory #4709
Comments
Bug in type inference: it should have stopped the use of a non-constant value as the shared memory size.
Thanks for the report. This should have been caught earlier, at type inference, since the shared memory size must be a compile-time constant. Marking as a bug.
Numba version: 0.46.0
Running this code with "python error_minimal_execution.py":
An error is raised for 'b_cache' but not for 'a_cache':
However, running the same code with "NUMBA_ENABLE_CUDASIM=1 python error_minimal_execution.py" raises no error, and I can debug normally.