When using shap to understand which parts of the text the model is attending to, the following happens:
Since the output probabilities of some transformers are heavily concentrated at the extremes (very close to 0 or to 1), the colours in the SHAP text representation are very dark, and it is difficult to read the words underneath them to make an analysis. Simply reducing the intensity of these colours would be enough.
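As a workaround sketch of the idea, the per-token attribution values could be mapped to colours whose opacity is capped, so that even extreme values never fully obscure the text. The helper below is hypothetical (it is not part of the shap API); it only illustrates the intensity-capping behaviour being requested:

```python
import numpy as np

def token_colours(values, max_alpha=0.6):
    """Map per-token attribution values to RGBA colours with a capped
    alpha so even extreme values stay readable.

    Hypothetical helper for illustration only -- not part of shap.
    `values` is a 1-D sequence of per-token SHAP values; positive
    values are tinted red, negative values blue.
    """
    values = np.asarray(values, dtype=float)
    peak = np.abs(values).max() or 1.0  # avoid division by zero
    colours = []
    for v in values:
        # Opacity scales with |value| but never exceeds max_alpha,
        # which is the "reduced intensity" the issue asks for.
        alpha = abs(v) / peak * max_alpha
        rgb = (255, 0, 80) if v > 0 else (0, 120, 255)
        colours.append((*rgb, round(alpha, 3)))
    return colours
```

With this cap, the darkest possible highlight is `max_alpha` rather than fully opaque, so the token text remains legible regardless of how extreme the model's probabilities are.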
Hi @saraesteveznewtral, there is definitely a readability and usability problem; we will improve it as soon as possible. Thank you for reporting it.
Amelie-V added the `area: ui` (related to the User Interface) and `type: bug` (an unexpected problem or unintended behavior) labels on Oct 28, 2022.