
Include per-scene eval metrics #641

Merged
merged 1 commit into develop from lf/multi-eval on Jan 28, 2019

Conversation

@lewfish (Contributor) commented Dec 26, 2018

Overview

This PR includes the metrics for each individual scene in eval.json, alongside the overall metrics. This is useful for comparing performance across different cities. The output format has changed to the following:

{   "overall": [   {   "class_id": 1,
                       "class_name": "one",
                       "count_error": 50,
                       "f1": 0.6666666666666666,
                       "gt_count": 100,
                       "precision": 1.0,
                       "recall": 0.5},
                   {   "class_id": 2,
                       "class_name": "two",
                       "count_error": 50.0,
                       "f1": 0.6666666666666666,
                       "gt_count": 100,
                       "precision": 1.0,
                       "recall": 0.5},
                   {   "class_id": null,
                       "class_name": "average",
                       "count_error": 50.0,
                       "f1": 0.6666666666666666,
                       "gt_count": 200,
                       "precision": 1.0,
                       "recall": 0.5}],
    "per_scene": {   "1": [ ... ],
                              "2": [ ... ] }
}
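
For the motivating use case, comparing metrics across scenes (e.g. different cities), the new "per_scene" key can be consumed with nothing but the standard library. A minimal sketch, assuming eval.json sits in the working directory and that each per-scene list carries the same "average" row shown under "overall" (the per-scene bodies are elided above):

import json

# Load the eval output; the path is an assumption and depends on where
# the experiment writes its results.
with open('eval.json') as f:
    eval_dict = json.load(f)

# Per the example above, each metrics list includes an "average" row
# (class_id == null) that summarizes all classes.
def average_row(metrics):
    return next(m for m in metrics if m['class_name'] == 'average')

print('overall f1: {:.3f}'.format(average_row(eval_dict['overall'])['f1']))

# "per_scene" maps scene ids to the same list-of-class-metrics structure,
# so scenes (e.g. cities) can be compared side by side.
for scene_id, metrics in sorted(eval_dict['per_scene'].items()):
    row = average_row(metrics)
    print('scene {}: f1={:.3f} gt_count={}'.format(
        scene_id, row['f1'], row['gt_count']))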

Testing

This adds a unit test to cover the new functionality.
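
The test itself lives in the diff; as a rough illustration of the arithmetic such a test has to pin down, note that the example rows above are self-consistent: with precision p and recall r, f1 = 2pr / (p + r), so p = 1.0 and r = 0.5 give 2/3, and count_error = 50 matches the absolute gap between 50 predictions and 100 ground-truth instances (that reading of count_error is an assumption here). A self-contained sanity check; compute_metrics is a hypothetical stand-in, not the raster-vision implementation or the actual test added by this PR:

# Hypothetical helper mirroring the per-class fields in the example above;
# an illustration only, not the library's code.
def compute_metrics(tp, fp, fn):
    gt_count = tp + fn        # ground-truth instances of the class
    pred_count = tp + fp      # predicted instances of the class
    precision = tp / pred_count
    recall = tp / gt_count
    f1 = 2 * precision * recall / (precision + recall)
    return {
        'gt_count': gt_count,
        'precision': precision,
        'recall': recall,
        'f1': f1,
        # assumed definition: absolute difference between counts
        'count_error': abs(pred_count - gt_count),
    }

# 50 true positives, 0 false positives, 50 false negatives reproduces the
# rows above: precision 1.0, recall 0.5, f1 = 2/3, count_error 50.
m = compute_metrics(tp=50, fp=0, fn=50)
assert m['precision'] == 1.0 and m['recall'] == 0.5
assert abs(m['f1'] - 2 / 3) < 1e-9
assert m['count_error'] == 50 and m['gt_count'] == 100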

Closes #637

@lewfish lewfish added the review label Dec 26, 2018
@lewfish lewfish changed the title from "WIP: Include per-scene eval metrics" to "Include per-scene eval metrics" Dec 26, 2018
@lewfish lewfish force-pushed the lf/multi-eval branch 4 times, most recently from 5816101 to 75808d6 Dec 26, 2018
@codecov (bot) commented Dec 31, 2018

Codecov Report

Merging #641 into develop will increase coverage by 0.3%.
The diff coverage is 92.85%.


@@            Coverage Diff             @@
##           develop     #641     +/-   ##
==========================================
+ Coverage    70.76%   71.07%   +0.3%     
==========================================
  Files          171      171             
  Lines         8067     8077     +10     
==========================================
+ Hits          5709     5741     +32     
+ Misses        2358     2336     -22
Impacted Files Coverage Δ
...astervision/evaluation/classification_evaluator.py 96.29% <100%> (ø) ⬆️
...stervision/evaluation/classification_evaluation.py 95.91% <100%> (+13.86%) ⬆️
...sion/evaluation/semantic_segmentation_evaluator.py 68.42% <50%> (+42.1%) ⬆️
...ion/evaluation/semantic_segmentation_evaluation.py 55.31% <0%> (+1.06%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7097791...0178c1c.

@lewfish lewfish force-pushed the lf/multi-eval branch from 75808d6 to cd9fc95 Jan 3, 2019
@lewfish lewfish force-pushed the lf/multi-eval branch from cd9fc95 to 08e04fe Jan 16, 2019
@jamesmcclain (Member) left a comment

Looks good to me.

@lewfish lewfish merged commit 01148f3 into develop Jan 28, 2019
2 checks passed
continuous-integration/travis-ci/pr: The Travis CI build passed
continuous-integration/travis-ci/push: The Travis CI build passed
@lewfish lewfish deleted the lf/multi-eval branch Jan 28, 2019
@lewfish lewfish removed the review label Jan 28, 2019