Evaluation on KITTI-360 Test set #30
Hi @hansoogithub, thanks for your interest in the project.

This is normal behavior: KITTI-360's test set has held-out labels, meaning you do not have access to the labels for performance evaluation; they are stored on a benchmarking server (see the official website). So the local performance evaluation of SPT can only be run on the validation set, as communicated in our paper. This is why you see empty test performance when running `src/eval.py`.

The Tensor Cores warning is unrelated to the above. You can safely ignore it.

Best,
Damien
How do I get evaluation metrics on the validation data for KITTI-360? Where should I specify that in order to get the evaluation metrics?
I have a problem viewing the performance evaluation numbers when I run

python src/eval.py experiment=kitti360 ckpt_path='downloaded checkpoint from your website'

Below is the result I got.

However, when I trained a new model from scratch on KITTI-360, I could view the numbers during training.
I get this warning during evaluation:

"You are using a CUDA device ('NVIDIA GeForce RTX 4090') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html#torch.set_float32_matmul_precision"
I tried using both 'high' and 'medium' precision, but there was no change in the evaluation result.

I used this project in a Docker container with GPU passthrough, set up according to your instructions (CUDA 11.8).

Please help, and thank you for the project.
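For reference, the precision setting mentioned in the warning is a one-line PyTorch call made before evaluation starts. A minimal sketch (assuming PyTorch >= 1.12; the setting only affects float32 matmul speed/precision on Tensor Core GPUs and, as Damien notes, is unrelated to the empty test metrics):

```python
import torch

# Opt in to TF32-backed float32 matmuls on Tensor Core GPUs.
# "high" trades a small amount of float32 precision for speed;
# it does not change which metrics get reported.
torch.set_float32_matmul_precision("high")

# Confirm the setting took effect before launching evaluation.
print(torch.get_float32_matmul_precision())  # -> high
```

Placing this near the top of the evaluation entry point (before any model forward passes) silences the warning.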