
Conversation

@drisspg drisspg commented Dec 20, 2024

Stacked PRs:


A quick, hacked-up first pass:

❯ torchfix auto_deprecate.py
auto_deprecate.py:8:15: TOR101 [*] Use of deprecated function torch.nn.functional.soft_margin_loss
--- /home/drisspg/meta/scripts/misc/auto_deprecate.py
+++ /home/drisspg/meta/scripts/misc/auto_deprecate.py
@@ -3,7 +3,7 @@
 
 import torch.nn.functional as F
 
 a = torch.tensor([1, 2, 3])
 b = torch.tensor([4, 5, 6])
-my_loss_fun = F.soft_margin_loss(a, b, size_average=True, reduce="some string")
+my_loss_fun = F.soft_margin_loss(a, b, reduction = 'mean')
 
Finished checking 1 files.
[*] 1 potentially fixable with the --fix option

stack-info: PR: #88, branch: drisspg/stack/1
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Dec 20, 2024
@kit1980 (Contributor) commented Dec 20, 2024

Won't this then complain about my_loss_fun = F.soft_margin_loss(a, b, reduction = 'mean')?

Probably better to create another visitor, like "torchfix/visitors/deprecated_arguments/" or something, instead of hacking on "torchfix/visitors/deprecated_symbols/", which is for cases where whole functions are deprecated.
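A minimal sketch of what such a "deprecated arguments" visitor could look like. TorchFix's real visitors are built on libcst; for brevity this sketch uses the stdlib ast module instead, and the DEPRECATED_ARGS table and function names are hypothetical, only illustrating the shape of the check:

```python
import ast

# Hypothetical map of function name -> deprecated keyword arguments.
DEPRECATED_ARGS = {
    "soft_margin_loss": {"size_average", "reduce"},
}

class DeprecatedArgsVisitor(ast.NodeVisitor):
    """Collect (lineno, function, argument) for deprecated keyword args."""

    def __init__(self):
        self.findings = []

    def visit_Call(self, node):
        # Resolve the called name, whether plain (soft_margin_loss)
        # or attribute access (F.soft_margin_loss).
        func = node.func
        name = func.attr if isinstance(func, ast.Attribute) else getattr(func, "id", None)
        deprecated = DEPRECATED_ARGS.get(name, set())
        for kw in node.keywords:
            if kw.arg in deprecated:
                self.findings.append((node.lineno, name, kw.arg))
        self.generic_visit(node)

def check(source):
    visitor = DeprecatedArgsVisitor()
    visitor.visit(ast.parse(source))
    return visitor.findings
```

A real TorchFix visitor would instead subclass the project's TorchVisitor over a libcst tree so it can also emit an autofix, but the detection logic would follow the same pattern.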


import libcst as cst
from ...common import TorchVisitor, get_module_name
from torch.nn._reduction import legacy_get_string
@kit1980 (Contributor) commented on the import of legacy_get_string:

Right now TorchFix doesn't depend on or use PyTorch.
Maybe we can add PyTorch to CI, but then which version?
Is it hard to copy or re-implement legacy_get_string?
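For reference, legacy_get_string is small enough to copy rather than import. A sketch of its behavior as I understand it from PyTorch's torch/nn/_reduction.py (size_average and reduce default to True; both truthy maps to 'mean', only reduce to 'sum', reduce falsy to 'none') — verify against the pinned PyTorch version before relying on it:

```python
import warnings

def legacy_get_string(size_average, reduce, emit_warning=True):
    """Map the legacy size_average/reduce pair to a reduction string.

    Behavior copied (not imported) from torch.nn._reduction, so the
    checker would not need a PyTorch dependency.
    """
    if size_average is None:
        size_average = True
    if reduce is None:
        reduce = True

    if size_average and reduce:
        ret = "mean"
    elif reduce:
        ret = "sum"
    else:
        ret = "none"

    if emit_warning:
        warnings.warn(
            "size_average and reduce args will be deprecated, "
            f"please use reduction='{ret}' instead."
        )
    return ret
```

This also matches the autofix in the diff above: size_average=True with a truthy reduce value maps to reduction='mean'.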

@drisspg closed this Jul 30, 2025