INT_MAX error #44
It means that the number of query-key pairs is too large. You may consider setting a larger voxel_size.
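To see why a larger voxel_size helps, here is a minimal voxel-grid downsampling sketch (illustrative only — not this repo's actual pipeline): each point is mapped to an integer voxel index, and one point is kept per occupied voxel, so a coarser grid keeps fewer points and thus produces far fewer query-key pairs in attention.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one point per occupied voxel (illustrative sketch)."""
    # Map each point to an integer voxel index.
    voxel_idx = np.floor(points / voxel_size).astype(np.int64)
    # Keep the first point that falls into each voxel.
    _, keep = np.unique(voxel_idx, axis=0, return_index=True)
    return points[np.sort(keep)]

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(100_000, 3))

small = voxel_downsample(pts, voxel_size=0.1)  # fine grid: keeps most points
large = voxel_downsample(pts, voxel_size=0.5)  # coarse grid: keeps far fewer
print(len(small), len(large))
```

Since the number of pairwise interactions grows roughly with the square of the point count, even a modest increase in voxel_size can pull index counts back under INT_MAX.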
Thank you for your answer. I thought maybe it's caused by something else, because the error I get with larger grids is:
For features I use an (N, 3) point cloud, and the same for Coordinates. My model settings are:
Basically, I reduced the model to be small: the point clouds I work with are so large that I'd rather just get it running in a "minimum" configuration as an initial step.
Update: the error disappears with 'stem_transformer' = False, so I guess there is an issue there.
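For anyone hitting the same thing, the workaround above amounts to a config change along these lines (only the stem_transformer name comes from this thread; the other keys are placeholders, not the repo's actual config schema):

```python
# Hypothetical config dict sketching the workaround from this thread.
# 'stem_transformer' is the flag mentioned above; other keys are
# made-up placeholders for illustration.
cfg = {
    "stem_transformer": False,  # workaround: disable the stem transformer
    "voxel_size": 0.04,         # placeholder value
    "batch_size": 1,            # placeholder value
}
print(cfg["stem_transformer"])
```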
The error:
@X-Lai Thank you, that solved the error. However, now I get the n_max error. Do you have an overview of which parameters to set and how? The parameters are a little opaque to me; so far I have not been able to use the model with external point clouds or random tensors.
The parameters like:
@X-Lai How exactly does quant_size relate to the voxel and grid sizes? What are the downsides of increasing the value? I've been needing to use a quant_size >= 0.1 to prevent the n_max error. Do I understand correctly that quant_size affects the sampling inside the transformer windows?
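A rough back-of-the-envelope model of why a coarser quant_size helps (my own sketch, not the repo's actual formula): if positions inside a cubic attention window are quantized to a step of quant_size, the number of distinct quantized positions grows with the cube of window_size / quant_size, so a larger quant_size shrinks the position tables and index ranges dramatically.

```python
def positions_per_window(window_size, quant_size):
    """Approximate count of quantized positions in a cubic window.

    Back-of-the-envelope model only. round() is used instead of
    ceil() to sidestep float artifacts like 0.8 / 0.02 != 40.0.
    """
    per_axis = max(1, round(window_size / quant_size))
    return per_axis ** 3

# A coarser quant_size means far fewer quantized positions per window.
fine = positions_per_window(window_size=0.8, quant_size=0.02)   # 40 per axis
coarse = positions_per_window(window_size=0.8, quant_size=0.1)  # 8 per axis
print(fine, coarse)
```

Under this model, the trade-off of increasing quant_size is coarser relative-position resolution inside each window, in exchange for staying under limits like n_max.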
Hello,
I am encountering the following error when using the model on larger point clouds. I have reduced the model size and the batch size; however, I still get the error.
Do you know what causes this? I haven't been able to find anything useful online.