
View partial results for inference #611

Open
lukeyeager opened this issue Feb 29, 2016 · 1 comment

@lukeyeager
Member

When running inference (one or more test images), the results page should be shown immediately as with other jobs. Then the results (layer visualizations, individual classifications, confusion matrix updates, etc.) can be sent incrementally over SocketIO.

This feature has been mentioned several times before:

I think the best way to solve this would be with the intermediate results / progress bar feature I mentioned earlier. Then the page can return quickly with a "0/1,300,000 images processed" page, and then slowly return the data as it comes through.
#70 (comment)

The basic issue with inference is that the request doesn't return a response until all of the classifications are complete (which could take a very long time).
#479 (comment)

Can we show the inference page immediately, like we do for other jobs? That would require sending the resulting data to the page through SocketIO. And if we're already sending data over SocketIO, can we send it incrementally as we get it?
#573 (comment)
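A minimal sketch of what the incremental delivery could look like on the server side, assuming Flask-SocketIO is used for the push channel; the `classify_one` helper, event names, and payload fields below are hypothetical and only illustrate the idea, they are not the actual DIGITS API:

```python
# Hypothetical sketch (not DIGITS code): stream per-image inference results to the
# results page over SocketIO instead of blocking until every image is classified.
# Assumes Flask-SocketIO; classify_one() and the event/payload names are made up.
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)


def classify_one(image_path):
    """Placeholder for a single-image forward pass returning (label, score) pairs."""
    return [('cat', 0.92), ('dog', 0.08)]


def run_inference(job_id, image_paths):
    total = len(image_paths)
    for done, path in enumerate(image_paths, start=1):
        predictions = classify_one(path)
        # Push each result as soon as it is available; the page can update its
        # "done/total" progress bar and append the new classification.
        socketio.emit('inference_update',
                      {'job_id': job_id,
                       'done': done,
                       'total': total,
                       'image': path,
                       'predictions': predictions},
                      namespace='/jobs')
    socketio.emit('inference_done', {'job_id': job_id}, namespace='/jobs')
```

With something like this, the results page can be rendered immediately showing "0/N images processed", subscribe to the job's SocketIO namespace, and fill in classifications, layer visualizations, and confusion-matrix updates as they arrive.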

@gheinrich
Contributor

Thanks Luke, I'll work on this as a priority!
