
fix(#1238): show prediction labels when annotating rule #1239

Merged
merged 9 commits into master from fix/show_prediction_labels_in_weak_supervision on Mar 10, 2022

Conversation

@leiyre (Member) commented on Mar 9, 2022

This PR shows prediction labels when annotating a rule in Weak Supervision.

Closes #1238

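For readers unfamiliar with the feature, here is a minimal, hypothetical sketch of the behaviour being restored: collecting the labels found in record predictions so they can be offered while annotating a rule. The record layout and function name are assumptions for illustration, not the actual Rubrix implementation.

```python
# Hypothetical sketch (not the Rubrix implementation): gather the labels that
# appear in record predictions so the rule annotation view can offer them.
from typing import List


def prediction_labels(records: List[dict]) -> List[str]:
    """Return the sorted set of labels found in record predictions."""
    labels = set()
    for record in records:
        # Each prediction is assumed to be a list of (label, score) pairs.
        for label, _score in record.get("prediction") or []:
            labels.add(label)
    return sorted(labels)


# Example: the rule annotation view would list ["negative", "positive"].
records = [
    {"text": "great movie", "prediction": [("positive", 0.9), ("negative", 0.1)]},
    {"text": "terrible plot", "prediction": [("negative", 0.8)]},
]
print(prediction_labels(records))
```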
@leiyre requested a review from frascuchon on March 9, 2022 at 15:05
@codecov bot commented on Mar 10, 2022

Codecov Report

Merging #1239 (8c4057c) into master (630091f) will decrease coverage by 0.02%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #1239      +/-   ##
==========================================
- Coverage   94.67%   94.65%   -0.02%     
==========================================
  Files         127      127              
  Lines        5444     5446       +2     
==========================================
+ Hits         5154     5155       +1     
- Misses        290      291       +1     
Flag Coverage Δ
pytest 94.65% <100.00%> (-0.02%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
...rubrix/server/tasks/text_classification/metrics.py 100.00% <100.00%> (ø)
src/rubrix/server/commons/es_helpers.py 81.73% <0.00%> (-0.46%) ⬇️
...rubrix/labeling/text_classification/weak_labels.py 100.00% <0.00%> (ø)

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 630091f...8c4057c.
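As a quick sanity check of the figures above, coverage is simply hits divided by total lines; a tiny, purely illustrative sketch using the values from the report:

```python
# Coverage is hits / total lines; values taken from the Codecov table above.
old = 5154 / 5444   # master: reported as 94.67%
new = 5155 / 5446   # this PR: reported as 94.65%
print(f"delta = {new - old:+.4%}")  # roughly -0.02 percentage points
```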

@frascuchon merged commit 6f19e40 into master on Mar 10, 2022
@frascuchon deleted the fix/show_prediction_labels_in_weak_supervision branch on March 10, 2022 at 14:29
frascuchon pushed a commit that referenced this pull request Mar 11, 2022
This PR shows prediction labels when annotating a rule in Weak Supervision

Closes #1238

* fix(metrics): compute dataset labels as python metric

* test: fix tests

* fix: compute dataset label properly

* fix show all labels for empty query view

* empty query

* refactor: revert comp. changes and compute labels in model

* chore: lint fix

Co-authored-by: Francisco Aranda <francisco@recogn.ai>
(cherry picked from commit 6f19e40)
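The commit list above mentions computing the dataset labels as a Python-side metric; below is a minimal, hypothetical sketch of that idea (record layout and function name are assumptions, not the actual Rubrix code): union the labels seen in annotations and predictions across records.

```python
# Hypothetical sketch of a Python-side "dataset labels" metric: union the labels
# seen in annotations and predictions across records (names are illustrative).
from typing import Iterable, Set


def dataset_labels(records: Iterable[dict]) -> Set[str]:
    labels: Set[str] = set()
    for record in records:
        annotation = record.get("annotation")
        if annotation:
            labels.add(annotation)
        for label, _score in record.get("prediction") or []:
            labels.add(label)
    return labels


records = [
    {"annotation": "positive", "prediction": [("positive", 0.9)]},
    {"annotation": None, "prediction": [("negative", 0.7), ("neutral", 0.2)]},
]
print(sorted(dataset_labels(records)))  # ['negative', 'neutral', 'positive']
```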
frascuchon pushed eight more commits referencing this pull request (four on Mar 11, one on Mar 25, two on Mar 28, and one on Mar 30, 2022), each cherry-picked from commit 6f19e40 with the same message; some also include follow-up fixes for unused or missing imports and tests.
Labels: None yet
Projects: None yet
Development

Successfully merging this pull request may close these issues.

[Weak Supervision] prediction labels are not visible
2 participants