
Bug of device and dtype of WgtScaleBatchNorm.std #115

Closed
fangwei123456 opened this issue Oct 15, 2022 · 0 comments · Fixed by #116

Objective of issue:

Lava DL version:

- [x] 0.3.0 (feature release)
- [ ] 0.2.1 (bug fixes)
- [ ] 0.2.0 (current version)
- [ ] 0.1.2

Lava version:

- [x] 0.5.0
- [ ] 0.4.0 (feature release)
- [ ] 0.3.1 (bug fixes)
- [ ] 0.3.0 (current version)
- [ ] 0.2.0
- [ ] 0.1.2

I'm submitting a ...

- [x] bug report
- [ ] feature request
- [ ] documentation request

Current behavior:

Traceback (most recent call last):
  File "/home/wfang/spikingjelly_dev/spikingjelly/test4.py", line 15, in <module>
    net(x)
  File "/home/wfang/anaconda3/envs/lava-env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/wfang/anaconda3/envs/lava-env/lib/python3.10/site-packages/lava/lib/dl/slayer/neuron/cuba.py", line 439, in forward
    _, voltage = self.dynamics(input)
  File "/home/wfang/anaconda3/envs/lava-env/lib/python3.10/site-packages/lava/lib/dl/slayer/neuron/cuba.py", line 365, in dynamics
    current = self.norm(current)
  File "/home/wfang/anaconda3/envs/lava-env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/wfang/anaconda3/envs/lava-env/lib/python3.10/site-packages/lava/lib/dl/slayer/neuron/norm.py", line 209, in forward
    std = self.std(var)
  File "/home/wfang/anaconda3/envs/lava-env/lib/python3.10/site-packages/lava/lib/dl/slayer/neuron/norm.py", line 170, in std
    return torch.ones(1) << torch.ceil(torch.log2(std)).clamp(
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!

Expected behavior:

  • No error is raised.

Steps to reproduce:

Run the following code:

from lava.lib.dl import slayer
import torch

net = slayer.neuron.cuba.Neuron(
    threshold=1.,
    current_decay=1.,
    voltage_decay=0.,
    scale=1 << 6,
    norm=slayer.neuron.norm.WgtScaleBatchNorm
)
device = 'cuda:0'
net.to(device)
with torch.no_grad():
    x = torch.rand([4, 4, 4], device=device)
    net(x)
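
The traceback points at `WgtScaleBatchNorm.std` (norm.py line 170), where `torch.ones(1)` is created on the CPU with the default dtype while `std` lives on `cuda:0`, so the left shift mixes devices. Below is a minimal, illustrative sketch of the failure mode and one possible workaround (creating the constant with the device and dtype of `std`); the clamp bounds here are placeholders, and this is not necessarily how #116 resolves it:

import torch

# Stand-in for the tensor computed inside WgtScaleBatchNorm.std;
# in the report it lives on cuda:0.
std = torch.rand(4, device='cuda:0') + 0.5
exponent = torch.ceil(torch.log2(std)).clamp(min=0)  # clamp bounds illustrative

# Fails as in the traceback: torch.ones(1) is a CPU tensor, exponent is on cuda:0.
# scale = torch.ones(1) << exponent

# Possible workaround: match the device and dtype of std when creating the constant.
scale = torch.ones(1, device=std.device, dtype=std.dtype) << exponent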

Related code:


Other information:

My env:

(lava-env) wfang@mlg-ThinkStation-P920:~$ conda list lava
# packages in environment at /home/wfang/anaconda3/envs/lava-env:
#
# Name                    Version                   Build  Channel
lava                      0.5.0              pyhd8ed1ab_0    conda-forge
lava-dl                   0.3.0              pyhd8ed1ab_0    conda-forge
(lava-env) wfang@mlg-ThinkStation-P920:~$ conda list torch
# packages in environment at /home/wfang/anaconda3/envs/lava-env:
#
# Name                    Version                   Build  Channel
pytorch                   1.12.1          cuda112py310h51fe464_200    conda-forge
torchvision               0.13.0          cuda112py310h453157a_0    conda-forge