Hi,
I ran into an issue while attempting to run the sashimi.py script: the LinearActivation function in ./standalone/s4.py does not accept the weight_norm keyword argument that the DownPool and UpPool classes pass in through **kwargs.
The error trace:
File ".\PycharmProjects\state-spaces\sashimi\sashimi.py", line 450, in <module>
model = Sashimi(n_layers=2).cuda()
File ".\PycharmProjects\state-spaces\sashimi\sashimi.py", line 298, in __init__
d_layers.append(DownPool(H, expand, p))
File ".\PycharmProjects\state-spaces\sashimi\sashimi.py", line 28, in __init__
self.linear = LinearActivation(
File ".\PycharmProjects\state-spaces\src\models\sequence\ss\standalone\s4.py", line 137, in LinearActivation
linear = linear_cls(d_input, d_output, bias=bias, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'weight_norm'
Looking a bit further, I noticed that neither nn.Conv1d (chosen in DownPool because transposed=True) nor the nn.Linear that could be called within LinearActivation() has an explicit weight_norm argument in PyTorch.
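For reference, my understanding is that weight normalization in PyTorch is applied by wrapping an existing module with torch.nn.utils.weight_norm rather than through a constructor argument, roughly like this (a minimal sketch, not code from the repo):

    import torch.nn as nn
    from torch.nn.utils import weight_norm

    # weight_norm wraps a module and reparameterizes its weight tensor;
    # neither nn.Linear nor nn.Conv1d accepts a weight_norm kwarg directly.
    linear = weight_norm(nn.Linear(64, 128))
    conv = weight_norm(nn.Conv1d(64, 128, kernel_size=1))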
Am I overlooking something?
Python: 3.9
PyTorch: 1.12 (latest stable release)
Thanks a lot for publishing your code with the papers!
Cheers,
Bavo
Thanks for pointing this out! It looks like I deleted some unused options in the V2.1 cleanup but didn't track down all of the other references. You should be able to just remove those arguments from the sashimi.py code. I am not planning to push small fixes like this yet, as V3 is almost done.
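Concretely, the fix would look something like this in the DownPool initialization (a rough sketch; the exact surrounding code in sashimi.py may differ slightly):

    self.linear = LinearActivation(
        d_input * pool,
        self.d_output,
        transposed=True,
        # weight_norm=True,  # removed: LinearActivation no longer accepts this
    )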