
"standard_error" on submission #49

Closed
devendrachaplot opened this issue May 14, 2020 · 6 comments

Comments

@devendrachaplot
Contributor

Hi,
My submission to the EvalAI portal failed, and the Stderr file shows only "standard_error".
Could you help me figure out the error?

@mathfac
Contributor

mathfac commented May 14, 2020

Hi Devendra, that's not the expected behavior of EvalAI and will be fixed. We will do our best to provide you with the error logs.

@RishabhJain2018
Contributor

Hi @devendrachaplot, your submission to the minival phase is timing out because it isn't able to run through all the episodes within 30 minutes.
(We will fix the error message at our end to be more meaningful.)

Please close this issue if this resolves your query.
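
For anyone hitting the same limit, a rough way to check locally whether a run fits the 30-minute minival budget is to wrap the evaluation loop with a wall-clock timer and project the total from the per-episode average. This is only a sketch: `run_episode` and `NUM_EPISODES` below are placeholders for your agent's actual per-episode evaluation call and the phase's actual episode count, not part of the challenge API.

```python
import time

NUM_EPISODES = 30          # placeholder: the real minival episode count may differ
TIME_BUDGET_S = 30 * 60    # the 30-minute limit mentioned above


def run_episode(i):
    """Placeholder for your agent's per-episode evaluation call."""
    time.sleep(0.1)  # stand-in for real work


start = time.perf_counter()
for i in range(NUM_EPISODES):
    run_episode(i)
    elapsed = time.perf_counter() - start
    per_episode = elapsed / (i + 1)
    projected = per_episode * NUM_EPISODES
    print(f"episode {i + 1}/{NUM_EPISODES}: elapsed {elapsed:.0f}s, "
          f"projected total {projected:.0f}s "
          f"({'within' if projected <= TIME_BUDGET_S else 'over'} budget)")
```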

@dhruvbatra
Contributor

Is it easy to bring your run-time down? We can bump the time-out limit, but that will reduce your iteration cycles when you get to test-std (which is 100x larger than minival).

@devendrachaplot
Contributor Author

devendrachaplot commented May 15, 2020

It would be great if the time-out limit could be increased! The local evaluation takes around 16 minutes on my machine. Is there a way to see the output of the model? Our submission prints FPS, and I am trying to figure out the ratio of FPS between my machine and the remote evaluation.
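
A minimal sketch of the kind of FPS logging being described, assuming one counter tick per simulator step; the loop body below is a placeholder, not the challenge's evaluation loop.

```python
import time


class FPSMeter:
    """Tracks environment steps per second since construction."""

    def __init__(self):
        self.start = time.perf_counter()
        self.steps = 0

    def tick(self, n=1):
        self.steps += n

    def fps(self):
        elapsed = time.perf_counter() - self.start
        return self.steps / elapsed if elapsed > 0 else 0.0


meter = FPSMeter()
for _ in range(1000):      # stand-in for the real episode/step loop
    time.sleep(0.001)      # placeholder for env.step(...) + agent.act(...)
    meter.tick()
print(f"throughput: {meter.fps():.1f} steps/sec")
# Comparing this number from a local run against the same print in the remote
# logs gives the local-to-remote FPS ratio discussed above.
```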

@dhruvbatra
Contributor

We're working on all of these points.

@devendrachaplot
Contributor Author

Thanks, the time limit issue was mostly due to an inconsistent Docker base image (https://github.com/facebookresearch/habitat-challenge/issues/33#issuecomment-629741382). Fixing the base Docker image reduced the runtime by around 50%.

Also, just for your reference, the remote evaluation (with the fixed Docker base image) now takes around 2x as long as my local evaluation on a 1080 Ti GPU (14 mins vs 7 mins).
