no torch.nn.functional.normalize #5
Update your PyTorch.
Unfortunately, I didn't look it up. Could you provide the link?
def normalize(input, p=2, dim=1, eps=1e-12):
See https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py#L883, from http://pytorch.org.
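For PyTorch builds old enough to lack F.normalize, an equivalent can be defined locally. This is a sketch based on the signature quoted above; the body follows the standard L_p-normalization formula:

```python
import torch

def normalize(input, p=2, dim=1, eps=1e-12):
    # L_p-normalize `input` along `dim`; `eps` guards against division by zero.
    # keepdim=True keeps the reduced dimension so the norm broadcasts
    # back against `input` row by row.
    return input / input.norm(p, dim, keepdim=True).clamp(min=eps)

x = torch.randn(4, 8)
y = normalize(x)  # each row of y has unit L2 norm
```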
The function was the same as the one you provided, but the test accuracy was lower on CIFAR-10. After the argument 'True' was deleted from input.norm, the program ran successfully, with the line modified to: w = beta * normalize(w.view(w.size(0), -1)).view_as(w) + alpha * delta. In the training process, test_acc was more volatile; train_loss became 'nan' on the 87th iteration, when test_acc had reached 83.7%, and from the next iteration to the end test_acc stayed at 10.0%. Why?
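Deleting the third argument to input.norm (keepdim=True) is a plausible cause of the divergence: without it the norm loses a dimension, so the division no longer broadcasts per row. A minimal sketch with illustrative shapes (a 64 x 27 tensor standing in for a flattened conv weight) shows the difference in a recent PyTorch:

```python
import torch

w = torch.randn(64, 27)                  # illustrative flattened conv weight
kept = w.norm(2, dim=1, keepdim=True)    # shape (64, 1)
flat = w.norm(2, dim=1)                  # shape (64,)

unit = w / kept                          # broadcasts per row: each row unit-norm
try:
    w / flat                             # (64, 27) vs (64,): trailing dims clash
    mismatch = False
except RuntimeError:
    mismatch = True                      # recent PyTorch raises a shape error here
```

Older PyTorch reductions kept the reduced dimension by default, which is why dropping the argument still "ran" there while silently changing the math.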
@eeric why did you modify the code? It might diverge for two reasons:
Please make sure you're running the code with the latest PyTorch and no modifications to the code.
1. Maybe; I attempted to modify lr=0.01 and to delete F.normalize. 2. My edit was according to the script below:
File "/home/yq/work/face_class/diracnets/diracnet.py", line 97, in block
    w = beta * F.normalize(w.view(w.size(0), -1)).view_as(w) + alpha * delta
AttributeError: 'module' object has no attribute 'normalize'
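This AttributeError means the installed torch.nn.functional predates normalize, which is why updating PyTorch fixes it. A hypothetical compatibility shim (l2_normalize is not part of the diracnets code, just an illustration) would fall back to the manual expression when the attribute is missing:

```python
import torch
import torch.nn.functional as F

def l2_normalize(t, dim=1, eps=1e-12):
    # Hypothetical shim: prefer F.normalize when the installed PyTorch has it,
    # otherwise compute the equivalent expression by hand.
    if hasattr(F, "normalize"):
        return F.normalize(t, p=2, dim=dim, eps=eps)
    return t / t.norm(2, dim, keepdim=True).clamp(min=eps)

w = torch.randn(5, 10)
wn = l2_normalize(w)  # rows of wn have unit L2 norm
```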