"standard_error" on submission #49
Hi,
My submission to the EvalAI portal failed, and the Stderr file shows only "standard_error".
Could you help me figure out the error?

Comments
Hi Devendra, that's not expected behavior of EvalAI and will be fixed. We will try our best to provide you with error logs.
Hi @devendrachaplot, your submission to the minival phase is timing out as it isn't able to run through all the episodes in 30 mins. Please close this issue if this resolves your query.
Is it easy to bring your run-time down? We can bump the time-out limit, but it will reduce your iteration cycles when you get to test-std (which is 100x larger than minival).
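To make the scale concrete, here is a rough back-of-the-envelope projection; the minival runtime below is illustrative, and it assumes runtime scales roughly linearly with the number of episodes:

```python
# Rough projection of test-std runtime from a minival run, assuming runtime
# scales roughly linearly with the number of episodes (a simplification).
minival_minutes = 16      # illustrative local minival runtime
scale_factor = 100        # test-std is ~100x larger than minival
test_std_minutes = minival_minutes * scale_factor
print(f"Projected test-std runtime: {test_std_minutes} min "
      f"(~{test_std_minutes / 60:.0f} h)")  # ~27 h at these numbers
```

At that scale even small per-episode savings compound, which is why keeping the minival runtime well under the time-out matters.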
It would be great if the time-out limit could be increased! The local evaluation takes around 16 mins on my machine. Is there a way to see the output of the model? Our submission prints FPS, and I am trying to figure out the ratio of FPS on my machine to FPS in the remote evaluation.
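For reference, a minimal sketch of the kind of FPS logging being discussed; the meter class and the loop below are illustrative, not the actual submission code:

```python
import time

class FPSMeter:
    """Tracks agent steps per second over the whole run."""

    def __init__(self):
        self.start = time.time()
        self.steps = 0

    def step(self):
        self.steps += 1

    @property
    def fps(self):
        elapsed = time.time() - self.start
        return self.steps / elapsed if elapsed > 0 else 0.0


# Hypothetical usage inside the evaluation loop; in a real submission the
# environment step / agent act() call would replace the body of this loop.
meter = FPSMeter()
for _ in range(1000):
    meter.step()
    if meter.steps % 100 == 0:
        # flush=True so the output shows up promptly in remote logs
        print(f"[eval] steps={meter.steps} fps={meter.fps:.1f}", flush=True)
```

Comparing the printed FPS locally and in the remote logs (once stdout is exposed) gives the local-to-remote throughput ratio directly.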
We're working on all of these points.
Thanks, the time limit issue was mostly because of an inconsistent docker base image (https://github.com/facebookresearch/habitat-challenge/issues/33#issuecomment-629741382). Fixing the base docker image reduced the runtime by around 50%. Also, just for your reference, the remote eval (with the fixed docker base image) now takes around 2x the time of my local evaluation with a 1080 Ti GPU (14 mins vs 7 mins).
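One way to pin down local-vs-remote differences like this is to print the runtime environment at the start of evaluation. A minimal sketch, assuming the submission uses PyTorch (swap in whichever framework the agent actually uses):

```python
# Print the runtime environment so local and remote runs can be compared
# from stdout; assumes a PyTorch-based submission.
import platform
import torch

print(f"python={platform.python_version()}", flush=True)
print(f"torch={torch.__version__} cuda={torch.version.cuda} "
      f"cudnn={torch.backends.cudnn.version()}", flush=True)
if torch.cuda.is_available():
    print(f"gpu={torch.cuda.get_device_name(0)}", flush=True)
```

Differences in the CUDA/cuDNN versions or the GPU model reported locally versus remotely can explain a constant runtime gap like the 2x observed here.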