
GUI for reviewing eval metrics #291

Closed
ntabris opened this issue Jan 29, 2020 · 5 comments

ntabris commented Jan 29, 2020

(details need to be worked out)


ntabris commented Apr 23, 2020

Mock-up of the table listing models for evaluation:

[Attached mock-up image: "Whiteboard 1-01"]


ntabris commented Apr 23, 2020

Sortable table for comparing models:

  • metrics: OKS mAP, visibility precision, 95% distance
  • each row shows a different model
  • ability to re-run inference
  • for labels predicted by the model, show timestamp and # of labels
  • button to inspect an individual model in more detail

(notes from discussion on 3/31)
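The comparison table described above could be backed by a simple sortable record per model. A minimal Python sketch of that data structure and sorting logic (field names like `oks_map` and `ModelEvalRow` are illustrative assumptions here, not SLEAP's actual API):

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record for one row of the model-comparison table.
# Field names are illustrative, not taken from the SLEAP codebase.
@dataclass
class ModelEvalRow:
    model_name: str
    oks_map: float          # OKS mAP
    vis_precision: float    # visibility precision
    dist_95: float          # 95th-percentile localization distance (px)
    trained_at: datetime    # timestamp of the training run
    n_labels: int           # number of labeled frames used

def sort_rows(rows, key="oks_map", descending=True):
    """Return rows sorted by the given metric column (any dataclass field)."""
    return sorted(rows, key=lambda r: getattr(r, key), reverse=descending)

rows = [
    ModelEvalRow("baseline", 0.71, 0.92, 8.4, datetime(2020, 3, 1), 400),
    ModelEvalRow("tuned", 0.78, 0.95, 6.1, datetime(2020, 3, 20), 650),
]
best = sort_rows(rows)[0]
print(best.model_name)  # → tuned (highest OKS mAP)
```

In a Qt-based GUI such as SLEAP's, the per-column sorting itself would typically be delegated to the table widget, with records like these feeding the model.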


ntabris commented Jul 13, 2020

Basic version of dialog:

[Screenshot: basic version of the metrics dialog]

Doesn't yet have a way to re-run inference or get full metrics.

ntabris added a commit that referenced this issue Jul 14, 2020

ntabris commented Jul 14, 2020

There's now an "Evaluate Metrics for Trained Models..." command in the Predict menu which shows the table. Double-click a row (model) to see its full metrics, or use the buttons under the table:

[Screenshot: model comparison table]

[Screenshot: full metrics view]

There's still no way to (re-)run inference; we can open another issue if we want to add that enhancement.

@ntabris ntabris closed this as completed Jul 14, 2020
ntabris added a commit that referenced this issue Jul 14, 2020

safas25 commented Apr 3, 2021

> Basic version of dialog:
>
> [Screenshot: basic version of the metrics dialog]
>
> Doesn't yet have a way to re-run inference or get full metrics.

Hi, nothing appears for me when I try this option; the table is just blank. Can you help?
