Running variational inference on models that include discrete choices currently fails for me with a RuntimeError:
> python examples/bernoulli_gmm.py
dataset loaded
/opt/conda/lib/python3.6/site-packages/torch/autograd/_functions/basic_ops.py:34: UserWarning: self and other not broadcastable, but have the same number of elements. Falling back to deprecated pointwise behavior.
return a.sub(b)
/opt/conda/lib/python3.6/site-packages/torch/autograd/_functions/basic_ops.py:63: UserWarning: self and other not broadcastable, but have the same number of elements. Falling back to deprecated pointwise behavior.
return a.div(b)
/opt/conda/lib/python3.6/site-packages/torch/autograd/_functions/basic_ops.py:17: UserWarning: self and other not broadcastable, but have the same number of elements. Falling back to deprecated pointwise behavior.
return a.add(b)
Traceback (most recent call last):
  File "examples/bernoulli_gmm.py", line 77, in <module>
    loss_sample = grad_step(i, data[i])
  File "/data/pyro/infer/kl_qp.py", line 33, in __call__
    return self.step(*args, **kwargs)
  File "/data/pyro/infer/kl_qp.py", line 75, in step
    loss.backward()
  File "/opt/conda/lib/python3.6/site-packages/torch/autograd/variable.py", line 156, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/opt/conda/lib/python3.6/site-packages/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
  File "/opt/conda/lib/python3.6/site-packages/torch/autograd/stochastic_function.py", line 15, in _do_backward
    raise RuntimeError("differentiating stochastic functions requires "
RuntimeError: differentiating stochastic functions requires providing a reward
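For context on what the error message means: a discrete sample has no pathwise derivative, so PyTorch's (pre-0.4) stochastic-function machinery expects the sample's log-probability gradient to be weighted by a "reward" (the score-function / REINFORCE estimator) before `backward()` is called. The sketch below illustrates that estimator for a Bernoulli parameter in plain Python; `reinforce_grad` is a hypothetical name for illustration, not part of Pyro or PyTorch.

```python
import random

def reinforce_grad(p, f, n_samples=100_000, seed=0):
    """Score-function (REINFORCE) estimate of d/dp E_{x~Bernoulli(p)}[f(x)].

    A discrete sample x cannot be backpropagated through directly;
    instead we weight the score d/dp log P(x | p) by the reward f(x),
    which is an unbiased estimator of the gradient of the expectation.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = 1 if rng.random() < p else 0
        # Score of a Bernoulli: d/dp log P(x | p) = x/p - (1-x)/(1-p)
        score = x / p - (1 - x) / (1 - p)
        total += f(x) * score
    return total / n_samples

# With f(x) = x, E[f(x)] = p, so the true gradient is 1 for any p.
estimate = reinforce_grad(0.3, lambda x: x)
```

This is presumably what the KL-qp loss would need to supply for each discrete sample site before calling `loss.backward()`.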
Versions:
I observed the same behavior for PyTorch 0.1.x (without the UserWarnings).