Add some accuracy (Sourcery refactored) #39

Merged — 1 commit merged into add_some_accuracy on Nov 16, 2021

Conversation


@sourcery-ai sourcery-ai bot commented Nov 16, 2021

Pull Request #38 refactored by Sourcery.

If you're happy with these changes, merge this Pull Request using the Squash and merge strategy.

NOTE: As code is pushed to the original Pull Request, Sourcery will
re-run and update (force-push) this Pull Request with new refactorings as
necessary. If Sourcery finds no refactorings at any point, this Pull Request
will be closed automatically.

See our documentation here.

Run Sourcery locally

Reduce the feedback loop during development by using the Sourcery editor plugin:

Review changes via command line

To manually merge these changes, make sure you're on the add_some_accuracy branch, then run:

```shell
git fetch origin sourcery/add_some_accuracy
git merge --ff-only FETCH_HEAD
git reset HEAD^
```

Help us improve this pull request!

@sourcery-ai sourcery-ai bot requested a review from ZhiboRao November 16, 2021 12:20
```diff
-        dice = 2 * intersection / union
-        return dice
+        return 2 * intersection / union
```
Function SegAccuracy.dice_score refactored with the following changes:
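For readers outside the diff context, the dice computation can be sketched as follows. This is a hedged NumPy illustration, not the project's torch implementation; it assumes `intersection` is the overlap count and `union` is the sum of the two mask areas:

```python
import numpy as np

def dice_score(pred, gt):
    # Dice = 2 * |pred ∩ gt| / (|pred| + |gt|) for binary masks.
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    intersection = np.logical_and(pred, gt).sum()
    union = pred.sum() + gt.sum()  # assumed meaning of "union" in the source
    return 2 * intersection / union
```

For example, `dice_score([1, 1, 0, 0], [1, 0, 0, 0])` gives 2/3: one overlapping pixel against three mask pixels in total.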

Comment on lines -96 to +98

```diff
-        confusion_matrix = torch.bincount(num_classes * gt[mask].int() + res[mask].int(),
-                                          minlength=num_classes**2).reshape(num_classes, num_classes)
-        return confusion_matrix
+        return torch.bincount(
+            num_classes * gt[mask].int() + res[mask].int(),
+            minlength=num_classes ** 2,
+        ).reshape(num_classes, num_classes)
```
Function SegAccuracy.generate_confusion_matrix refactored with the following changes:
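The bincount trick above is worth unpacking: each (ground-truth, prediction) pair is encoded as a single integer `num_classes * gt + res`, the encoded values are counted, and the counts are reshaped into a square matrix. A minimal NumPy sketch (the project uses torch; the optional `mask` mirrors the boolean indexing in the diff):

```python
import numpy as np

def confusion_matrix(gt, res, num_classes, mask=None):
    gt, res = np.asarray(gt), np.asarray(res)
    if mask is None:
        mask = np.ones(gt.shape, dtype=bool)
    # Encode each (gt, prediction) pair as one index in [0, num_classes**2),
    # count the indices, then fold the counts into a num_classes x num_classes grid.
    idx = num_classes * gt[mask].astype(int) + res[mask].astype(int)
    return np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
```

Rows index ground truth and columns index predictions, which is why the precision and recall snippets sum over different dimensions.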

```diff
-        precision = tp / (cm.sum(dim=0) + SegAccuracy.ACC_EPSILON)
-        return precision
+        return tp / (cm.sum(dim=0) + SegAccuracy.ACC_EPSILON)
```
Function SegAccuracy.precision_score refactored with the following changes:

Comment on lines -113 to +112

```diff
-        recall = tp / (cm.sum(dim=1) + SegAccuracy.ACC_EPSILON)
-        return recall
+        return tp / (cm.sum(dim=1) + SegAccuracy.ACC_EPSILON)
```
Function SegAccuracy.recall_score refactored with the following changes:
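With rows as ground truth and columns as predictions, precision divides the diagonal by column sums and recall divides it by row sums. A NumPy sketch, with `EPSILON` standing in for the class constant `ACC_EPSILON` (its value here is an assumption):

```python
import numpy as np

EPSILON = 1e-9  # stand-in for SegAccuracy.ACC_EPSILON; actual value is assumed

def precision_recall(cm):
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                             # true positives per class
    precision = tp / (cm.sum(axis=0) + EPSILON)  # column sums = predicted counts
    recall = tp / (cm.sum(axis=1) + EPSILON)     # row sums = ground-truth counts
    return precision, recall
```

The epsilon keeps classes that never occur (zero row or column sum) from producing a division by zero.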

Comment on lines -121 to +119

```diff
-        pa = tp.sum() / (cm.sum() + SegAccuracy.ACC_EPSILON)
-        return pa
+        return tp.sum() / (cm.sum() + SegAccuracy.ACC_EPSILON)
```
Function SegAccuracy.pa_score refactored with the following changes:
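Pixel accuracy is the simplest of these metrics: the diagonal sum over the total pixel count. A sketch under the same assumptions as above (NumPy stand-in, assumed `EPSILON` value):

```python
import numpy as np

EPSILON = 1e-9  # stand-in for SegAccuracy.ACC_EPSILON; actual value is assumed

def pa_score(cm):
    # Pixel accuracy: correctly classified pixels / all pixels.
    cm = np.asarray(cm, dtype=float)
    return np.diag(cm).sum() / (cm.sum() + EPSILON)
```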

Comment on lines -160 to +155

```diff
-        fwiou = (tp * freq_weight / (each_class_counts + SegAccuracy.ACC_EPSILON)).sum()
-        return fwiou
+        return (tp * freq_weight / (each_class_counts + SegAccuracy.ACC_EPSILON)).sum()
```
Function SegAccuracy.fwiou_score refactored with the following changes:
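Frequency-weighted IoU weights each class's IoU by how often the class occurs in the ground truth. The diff does not show how `freq_weight` and `each_class_counts` are computed, so the definitions below (ground-truth class frequency and per-class union) are assumptions based on the standard FWIoU formula:

```python
import numpy as np

EPSILON = 1e-9  # stand-in for SegAccuracy.ACC_EPSILON; actual value is assumed

def fwiou_score(cm):
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    freq_weight = cm.sum(axis=1) / cm.sum()                   # assumed: ground-truth class frequency
    each_class_counts = cm.sum(axis=1) + cm.sum(axis=0) - tp  # assumed: per-class union
    return (tp * freq_weight / (each_class_counts + EPSILON)).sum()
```

Each term `tp_i / union_i` is the per-class IoU, so the sum is a frequency-weighted average of IoUs.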

Comment on lines -216 to +210

```diff
-        oa = tp.sum() / (cm.sum() + CDAccuracy.ACC_EPSILON)
-        return oa
+        return tp.sum() / (cm.sum() + CDAccuracy.ACC_EPSILON)
```
Function CDAccuracy.oa_score refactored with the following changes:

Comment on lines -226 to +219

```diff
-        precision = tp / (cm.sum(dim=0) + CDAccuracy.ACC_EPSILON)
-        return precision
+        return tp / (cm.sum(dim=0) + CDAccuracy.ACC_EPSILON)
```
Function CDAccuracy.precision_score refactored with the following changes:

Comment on lines -236 to +228

```diff
-        recall = tp / (cm.sum(dim=1) + CDAccuracy.ACC_EPSILON)
-        return recall
+        return tp / (cm.sum(dim=1) + CDAccuracy.ACC_EPSILON)
```
Function CDAccuracy.recall_score refactored with the following changes:

Comment on lines -261 to +257

```diff
-        f = (1 + beta ** 2) * precision * recall / (precision * beta ** 2 + recall + CDAccuracy.ACC_EPSILON)
-        return f
+        return (
+            (1 + beta ** 2)
+            * precision
+            * recall
+            / (precision * beta ** 2 + recall + CDAccuracy.ACC_EPSILON)
+        )
```
Function CDAccuracy.f_score refactored with the following changes:
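The expression above is the standard F-beta score: beta > 1 favors recall, beta < 1 favors precision, and beta = 1 recovers F1. A scalar sketch, with `EPSILON` standing in for `CDAccuracy.ACC_EPSILON` (value assumed):

```python
EPSILON = 1e-9  # stand-in for CDAccuracy.ACC_EPSILON; actual value is assumed

def f_score(precision, recall, beta=1.0):
    # F-beta: weighted harmonic mean of precision and recall.
    return (1 + beta ** 2) * precision * recall / (precision * beta ** 2 + recall + EPSILON)
```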


sourcery-ai bot commented Nov 16, 2021

Sourcery Code Quality Report

✅  Merging this PR will increase code quality in the affected files by 0.02%.

| Quality metrics | Before | After | Change |
|---|---|---|---|
| Complexity | 0.82 ⭐ | 0.83 ⭐ | 0.01 👎 |
| Method Length | 45.84 ⭐ | 44.72 ⭐ | -1.12 👍 |
| Working memory | 8.11 🙂 | 7.99 🙂 | -0.12 👍 |
| Quality | 73.49% 🙂 | 73.51% 🙂 | 0.02% 👍 |

| Other metrics | Before | After | Change |
|---|---|---|---|
| Lines | 280 | 275 | -5 |

| Changed files | Quality Before | Quality After | Quality Change |
|---|---|---|---|
| Source/JackFramework/Evaluation/accuracy.py | 73.49% 🙂 | 73.51% 🙂 | 0.02% 👍 |

Here are some functions in these files that still need a tune-up:

| File | Function | Complexity | Length | Working Memory | Quality | Recommendation |
|---|---|---|---|---|---|---|
| Source/JackFramework/Evaluation/accuracy.py | debug_main | 1 ⭐ | 378 ⛔ | 11 😞 | 48.48% 😞 | Try splitting into smaller methods. Extract out complex expressions |
| Source/JackFramework/Evaluation/accuracy.py | SMAccuracy.d_1 | 1 ⭐ | 117 🙂 | 13 😞 | 60.56% 🙂 | Extract out complex expressions |
| Source/JackFramework/Evaluation/accuracy.py | CDAccuracy.generate_confusion_matrix | 0 ⭐ | 65 🙂 | 10 😞 | 74.13% 🙂 | Extract out complex expressions |

Legend and Explanation

The emojis denote the absolute quality of the code:

  • ⭐ excellent
  • 🙂 good
  • 😞 poor
  • ⛔ very poor

The 👍 and 👎 indicate whether the quality has improved or gotten worse with this pull request.


Please see our documentation here for details on how these metrics are calculated.

We are actively working on this report - lots more documentation and extra metrics to come!

Help us improve this quality report!

@ZhiboRao ZhiboRao merged commit 57db26c into add_some_accuracy Nov 16, 2021
@ZhiboRao ZhiboRao deleted the sourcery/add_some_accuracy branch December 8, 2021 09:41