Bayesian GRU #87

Open
Anusha851 opened this issue Jun 14, 2021 · 2 comments


Anusha851 commented Jun 14, 2021

I get the following error from my Bayesian GRU implementation:


ValueError Traceback (most recent call last)
in
9 labels=labels.to(device),
10 criterion=criterion,
---> 11 sample_nbr=3)
12 loss.backward()
13 optimizer.step()

/opt/conda/lib/python3.7/site-packages/blitz/utils/variational_estimator.py in sample_elbo(self, inputs, labels, criterion, sample_nbr, complexity_cost_weight)
63 loss = 0
64 for _ in range(sample_nbr):
---> 65 outputs = self(inputs)
66 loss += criterion(outputs, labels)
67 loss += self.nn_kl_divergence() * complexity_cost_weight

/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
725 result = self._slow_forward(*input, **kwargs)
726 else:
--> 727 result = self.forward(*input, **kwargs)
728 for hook in itertools.chain(
729 _global_forward_hooks.values(),

in forward(self, encodings, hidden_states, sharpen_loss)
17
18 #pass the inputs to the model
---> 19 x,t = self.bGRU1(encodings,hidden_states = None, sharpen_loss = None)
20 x,t = self.bGRU2(x,t)
21 return x

/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
725 result = self._slow_forward(*input, **kwargs)
726 else:
--> 727 result = self.forward(*input, **kwargs)
728 for hook in itertools.chain(
729 _global_forward_hooks.values(),

/opt/conda/lib/python3.7/site-packages/blitz/modules/gru_bayesian_layer.py in forward(self, x, hidden_states, sharpen_loss)
221 sharpen_loss = None
222
--> 223 return self.forward_(x, hidden_states, sharpen_loss)
224

/opt/conda/lib/python3.7/site-packages/blitz/modules/gru_bayesian_layer.py in forward_(self, x, hidden_states, sharpen_loss)
135
136 #Assumes x is of shape (batch, sequence, feature)
--> 137 bs, seq_sz, _ = x.size()
138 hidden_seq = []
139

ValueError: not enough values to unpack (expected 3, got 2)

For this line of code:

for i, (datapoints, labels) in enumerate(train_dataloader):
    optimizer.zero_grad()

    loss = gru.sample_elbo(inputs=datapoints.to(device),
                           labels=labels.to(device),
                           criterion=criterion,
                           sample_nbr=3)

My datapoints have shape torch.Size([4, 768]). How am I expected to reshape them? Please advise. Thanks.

@donhauser

You can add a new axis by using None as a selector for your tensor/array.

This way you can "prepend" an extra dimension to 2-D data in both NumPy and torch:

# datapoints have shape [4, 768]
datapoints = datapoints[None, :, :]
# datapoints now have shape [1, 4, 768]

By swapping None with :, you could also reshape to [4, 1, 768] or [4, 768, 1].
If you need to swap 4 and 768 for whatever reason, have a look at .transpose().

Source: https://sparrow.dev/adding-a-dimension-to-a-tensor-in-pytorch/
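For reference, a short self-contained sketch (using a dummy random tensor in place of the real datapoints) showing that the None indexing above is equivalent to torch.unsqueeze, plus the other axis placements and the .transpose() option mentioned:

```python
import torch

# Dummy batch mirroring the shape from the issue: [4, 768]
datapoints = torch.randn(4, 768)

# Prepend an extra dimension — both forms produce shape [1, 4, 768]
a = datapoints[None, :, :]
b = datapoints.unsqueeze(0)
assert a.shape == b.shape == torch.Size([1, 4, 768])

# Insert the new axis in the middle or at the end instead
mid = datapoints[:, None, :]   # shape [4, 1, 768]
end = datapoints[:, :, None]   # shape [4, 768, 1]

# Swap two dimensions with .transpose(dim0, dim1)
swapped = mid.transpose(0, 1)  # shape [1, 4, 768]
```

Which placement is correct depends on what the layer expects; the traceback above (`bs, seq_sz, _ = x.size()`) indicates the BLiTZ GRU assumes (batch, sequence, feature) input.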

@Anusha851
Author

Thank you.
