
[Score info] In NER #1379

Closed
Amelie-V opened this issue Apr 5, 2022 · 16 comments · Fixed by #1389 or #1451
@Amelie-V (Member) commented Apr 5, 2022

No description provided.

@Amelie-V Amelie-V created this issue from a note in Release (Backlog) Apr 5, 2022
@Amelie-V Amelie-V moved this from Backlog to Planified in Release Apr 5, 2022
@Amelie-V (Member, Author) commented Apr 5, 2022

Update filter score style

  • Update the modal style: grey background, shadow, no border; change the "results" color to blue #0508d9 and the score line to purple #4c4ea3 (see the style sketch at the end of this comment).
  • Open the score modal just under the selector (as with all filters); it should fit the selector width.

Prediction tooltip

  • Show the prediction % in the prediction tooltip (Explore and Annotate modes).
  • In general, add a 1 px darker line to the predicted-token tooltip.
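
A minimal sketch of the requested modal styling, assuming a plain CSS-in-JS style object; the grey and shadow values are illustrative assumptions, only the two hex colors come from the request above:

```ts
// Hedged sketch, not the actual UI code: the grey and shadow values are assumptions,
// only RESULTS_COLOR and SCORE_LINE_COLOR come from the issue description.
const scoreModalStyle: Partial<CSSStyleDeclaration> = {
  background: "#f5f5f5",                      // grey background (exact grey not specified)
  boxShadow: "0 2px 8px rgba(0, 0, 0, 0.15)", // soft shadow (values assumed)
  border: "none",                             // no border
};

const RESULTS_COLOR = "#0508d9";    // "results" text in blue
const SCORE_LINE_COLOR = "#4c4ea3"; // score line in purple
```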

@Amelie-V (Member, Author) commented Apr 5, 2022

[Screenshot 2022-04-05 at 15:25:39]

@dcfidalgo (Contributor) commented:

Just a comment, more for a second iteration: I think it would be useful to see the scores without having to hover over the predictions. This can be useful when one has longer records with a lot of predictions and uses the score filter. Right now you have to hover over all the predictions to figure out which of them matched the filter (usually the predictions with a low score).

@frascuchon (Member) commented:

> Just a comment, more for a second iteration: I think it would be useful to see the scores without having to hover over the predictions. This can be useful when one has longer records with a lot of predictions and uses the score filter. Right now you have to hover over all the predictions to figure out which of them matched the filter (usually the predictions with a low score).

Does it make sense to allow seeing only the predictions that match the score used in the filters? I mean, for records with a lot of labels, maybe it makes sense to show only the affected entities instead of all of them. Including a score for every entity could overload the information shown to the user, even more so if the records have a lot of labels.

@dvsrepo (Member) commented Apr 6, 2022

> Just a comment, more for a second iteration: I think it would be useful to see the scores without having to hover over the predictions. This can be useful when one has longer records with a lot of predictions and uses the score filter. Right now you have to hover over all the predictions to figure out which of them matched the filter (usually the predictions with a low score).

> Does it make sense to allow seeing only the predictions that match the score used in the filters? I mean, for records with a lot of labels, maybe it makes sense to show only the affected entities instead of all of them. Including a score for every entity could overload the information shown to the user, even more so if the records have a lot of labels.

Yes, I agree with both. As David mentions, I'd leave this for a second iteration and open a design issue to analyze the different options (showing only affected entities, etc.). cc @Amelie-V

@Amelie-V (Member, Author) commented Apr 6, 2022

As @dcfidalgo mentioned, it's useful info for low-score predictions. Do you think it's correct to show the % without decimals? E.g. 1% or 0%?

@dcfidalgo (Contributor) commented:

> Does it make sense to allow seeing only the predictions that match the score used in the filters? I mean, for records with a lot of labels, maybe it makes sense to show only the affected entities instead of all of them. Including a score for every entity could overload the information shown to the user, even more so if the records have a lot of labels.

This could be another option; I think we really have to see it. Maybe hiding predictions can be confusing, maybe showing all scores can be overwhelming.

> As @dcfidalgo mentioned, it's useful info for low-score predictions. Do you think it's correct to show the % without decimals? E.g. 1% or 0%?

I think most of the time the percentage without decimals is more than sufficient. Maybe it would still be nice if the user could somehow consult the full number if required.
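
A minimal sketch of this rounding behaviour, assuming the score is a fraction between 0 and 1; the function name and return shape are illustrative, not the actual UI code:

```ts
// Hedged sketch: show an integer percentage in the tooltip while keeping
// the full value available (e.g. for consulting the full number on demand).
function formatScore(score: number): { display: string; full: string } {
  const percent = score * 100;
  return {
    display: `${Math.round(percent)}%`, // rounded, e.g. "14%"
    full: `${percent}%`,                // unrounded value, for consulting on demand
  };
}
```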

@Amelie-V (Member, Author) commented Apr 6, 2022

[Screenshot 2022-04-06 at 11:29:49]

leiyre added a commit that referenced this issue Apr 7, 2022
closes #1379
This PR includes score in prediction tooltip, improves score filter styles and prediction highlighting in text
@frascuchon frascuchon moved this from Planified to In progress in Release Apr 7, 2022
@Amelie-V (Member, Author) commented Apr 13, 2022

After doing a last review, I also find it confusing to show entities that do not match the score filter just because they are part of the record.

Reduce opacity for entities that don't match the filters

I would recommend implementing from now on that all entities outside the filter are shown with a light opacity to distinguish them from the others (see the sketch after this comment).
e.g. score filter = 0% to 10%: an entity with a score > 10% is still shown in the record (underline + tooltip), but at 70% opacity.

This behavior should be generalized to all filters.
e.g. "predicted_as: ORG" in both modes, Explore and Annotate.

Leave at most 2 decimals for the score in the score dropdown

e.g. 0% to 14.00000000002% -> 0% to 14%, 0% to 14.05%
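
A minimal sketch of the two rules above, assuming the filter bounds and scores are handled as percentages; the names are illustrative, not the actual frontend code:

```ts
// Hedged sketch of the behaviour described above; names are illustrative.
interface ScoreRange {
  from: number; // e.g. 0
  to: number;   // e.g. 10 (percentages)
}

// Entities outside the active score filter stay visible, but dimmed to 70% opacity.
function entityOpacity(scorePercent: number, range: ScoreRange): number {
  const inRange = scorePercent >= range.from && scorePercent <= range.to;
  return inRange ? 1 : 0.7;
}

// At most two decimals in the score dropdown: 14.00000000002 -> "14%", 14.05 -> "14.05%".
function formatRangeLabel(percent: number): string {
  return `${parseFloat(percent.toFixed(2))}%`;
}
```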

@Amelie-V (Member, Author) commented:

[Screenshot 2022-04-13 at 11:46:48]

@frascuchon frascuchon moved this from In progress to Pending QA in Release Apr 19, 2022
frascuchon pushed a commit that referenced this issue Apr 19, 2022
closes #1379
This PR includes score in prediction tooltip, improves score filter styles and prediction highlighting in text
@frascuchon frascuchon added the area: ui label (indicates that an issue or pull request is related to the User Interface (UI)) Apr 25, 2022
@Amelie-V (Member, Author) commented Apr 26, 2022

@leiyre The layout is broken in annotation mode (see screenshot).
DS: mock_tokenclass_scores_for_leire

Also, in annotation mode I would recommend showing the annotation highlight in the same tone as the prediction, which means: if the prediction is outside the score range, apply transparency to both the prediction underline and the annotation highlight. What do you think @frascuchon @dcfidalgo @leiyre @dvsrepo?

@Amelie-V (Member, Author) commented:

[Screenshot 2022-04-25 at 21:30:36]

@frascuchon frascuchon moved this from Pending Review to In progress in Release Apr 26, 2022
@dcfidalgo (Contributor) commented:

Not sure if the score filter (which is a filter for the predictions) should affect the annotations. But in general, I like the idea of somehow "hiding" the predictions that are not in the score range, by making them more transparent. I think we need to test this.

@Amelie-V (Member, Author) commented:

Yes, I understand that the score affects only the prediction. In any case, it's better to wait for the fix and test it all together.

frascuchon pushed a commit that referenced this issue Apr 26, 2022
closes #1379
This PR includes score in prediction tooltip, improves score filter styles and prediction highlighting in text
@leiyre leiyre moved this from In progress to Pending Review in Release Apr 26, 2022
@frascuchon (Member) commented:

Like @dcfidalgo says, it's a bit confusing for me to hide annotations based on the score filter, which is based on prediction scores.

@Amelie-V (Member, Author) commented:

Prediction visualisation suggestion (see the sketch below)

When the prediction is inside the score range: a strong 1 px line + the color underline
When the prediction is outside the score range: only the color underline
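
A minimal sketch of this suggestion, assuming the underline styles are applied via CSS classes; the class names are illustrative assumptions:

```ts
// Hedged sketch: pick the CSS classes for a prediction span depending on
// whether its score falls inside the active score filter. Class names are
// illustrative, not the actual component code.
function predictionClasses(
  scorePercent: number,
  range: { from: number; to: number }, // filter bounds as percentages
): string[] {
  const classes = ["prediction--color-underline"]; // colour underline is always shown
  if (scorePercent >= range.from && scorePercent <= range.to) {
    classes.push("prediction--strong-line"); // extra strong 1 px line when inside the range
  }
  return classes;
}
```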

@frascuchon frascuchon moved this from Pending Review to In progress in Release Apr 28, 2022
@leiyre leiyre moved this from In progress to Pending Review in Release Apr 28, 2022
@Amelie-V Amelie-V moved this from Pending Review to Review OK in Release May 3, 2022
@frascuchon frascuchon moved this from Review OK to Waiting Release in Release May 4, 2022
@frascuchon frascuchon linked a pull request May 4, 2022 that will close this issue
@frascuchon frascuchon moved this from Waiting Release to Ready to Release QA in Release May 4, 2022
frascuchon pushed a commit that referenced this issue May 4, 2022
(cherry picked from commit 3f7eec2)

- fix: remove wrong annotations from Token Classifier in explore view (#1451)
(cherry picked from commit 49a08c3)
@Amelie-V Amelie-V moved this from Ready to Release QA to Approved Release QA in Release May 5, 2022
frascuchon pushed a commit that referenced this issue May 10, 2022
(cherry picked from commit 3f7eec2)

- fix: remove wrong annotations from Token Classifier in explore view (#1451)
(cherry picked from commit 49a08c3)