ValueError: Broadcasting failed #2

Open
dotaa opened this issue Feb 26, 2018 · 3 comments
dotaa commented Feb 26, 2018

Hey,
I am using my own dataset and getting the following error:
```
Traceback (most recent call last):
  File "train_mnist.py", line 177, in <module>
    main()
  File "train_mnist.py", line 173, in main
    trainer.run()
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/training/trainer.py", line 313, in run
    six.reraise(*sys.exc_info())
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/training/trainer.py", line 299, in run
    update()
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/training/updater.py", line 223, in update
    self.update_core()
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/training/updater.py", line 234, in update_core
    optimizer.update(loss_func, *in_arrays)
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/optimizer.py", line 536, in update
    loss = lossfun(*args, **kwds)
  File "/home/ram/sanaa/workspace/style_occasion/sanaa/train_conv/train_conv_multi/chainer-center-loss-master/model.py", line 49, in __call__
    center_loss = self.center_loss_function(h, t)
  File "/home/ram/sanaa/workspace/style_occasion/sanaa/train_conv/train_conv_multi/chainer-center-loss-master/center_loss.py", line 78, in __call__
    return CenterLossFunction(self.alpha, self.num_classes)(x, t, self.centers)
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/function.py", line 235, in __call__
    ret = node.apply(inputs)
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/function_node.py", line 245, in apply
    outputs = self.forward(in_data)
  File "/home/ram/sanaa/local/lib/python2.7/site-packages/chainer/function.py", line 135, in forward
    return self._function.forward(inputs)
  File "/home/ram/sanaa/workspace/style_occasion/sanaa/train_conv/train_conv_multi/chainer-center-loss-master/center_loss.py", line 39, in forward
    y = xp.sum(xp.square(features - centers_batch)) / batch_size / 2
  File "cupy/core/core.pyx", line 1185, in cupy.core.core.ndarray.__sub__
  File "cupy/core/elementwise.pxi", line 798, in cupy.core.core.ufunc.__call__
  File "cupy/core/core.pyx", line 2308, in cupy.core.core.broadcast.__init__
ValueError: Broadcasting failed
```
Help would be appreciated!
Thanks!

shunk031 (Owner) commented Feb 27, 2018

```
File "/home/ram/sanaa/workspace/style_occasion/sanaa/train_conv/train_conv_multi/chainer-center-loss-master/center_loss.py", line 39, in forward
  y = xp.sum(xp.square(features - centers_batch)) / batch_size / 2
```

Please check the shapes of `features` and `centers_batch`. If the shapes do not match, the error above is raised.
This code was written with Python 3.5.1, so I recommend using Python 3.5 or higher.
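To illustrate: the subtraction on line 39 only works when the two arrays have identical shapes (one center row gathered per sample). A minimal NumPy sketch of the check and of that loss term (NumPy standing in for `cupy`/`xp`; the shapes here are made up for the example):

```python
import numpy as np

batch_size, feature_dim = 4, 2

# One feature vector per sample, and one gathered center per sample:
# these must have the same shape for elementwise subtraction.
features = np.random.randn(batch_size, feature_dim).astype(np.float32)
centers_batch = np.random.randn(batch_size, feature_dim).astype(np.float32)
assert features.shape == centers_batch.shape

# The center-loss term from center_loss.py line 39.
y = np.sum(np.square(features - centers_batch)) / batch_size / 2

# A mismatched shape (e.g. centers gathered with the wrong feature
# dimension) makes the subtraction fail with a broadcasting error.
bad_centers = np.random.randn(batch_size, feature_dim + 1)
try:
    features - bad_centers
except ValueError as e:
    print("broadcast error:", e)
```

Printing `features.shape` and `centers_batch.shape` just before line 39 should show where the mismatch comes from.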

dotaa (Author) commented Feb 27, 2018

Hello! Yes, I solved that error.
It runs now, but my cost values explode to NaN when I use center loss; when I disable it, the values are fine.
There is also an issue with the script when passing
`center_loss = self.center_loss_function(h, t, self.alpha_ratio)`
It gives the error "Can take only 3 arguments, 4 given", so I removed `self.alpha_ratio`. Is that why my values are going to NaN?
Can anybody help me with this?
Thanks!
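For what it's worth, in the original center-loss formulation (Wen et al.) the per-class center update is scaled by a rate α, i.e. c_j ← c_j − α·Δc_j, so dropping the α scaling effectively takes a full-size step (α = 1) and makes the centers move much further per iteration. A minimal NumPy sketch of that update rule (hypothetical names; not this repository's exact code):

```python
import numpy as np

num_classes, feature_dim = 3, 2
rng = np.random.default_rng(0)
centers = rng.normal(size=(num_classes, feature_dim))
features = rng.normal(size=(5, feature_dim))
labels = np.array([0, 1, 1, 2, 0])

def update_centers(centers, features, labels, alpha):
    """One center update step per the center-loss paper:
    delta_c_j = sum_{i: y_i = j} (c_j - x_i) / (1 + count_j),
    then c_j <- c_j - alpha * delta_c_j."""
    new_centers = centers.copy()
    for j in range(centers.shape[0]):
        mask = labels == j
        count = mask.sum()
        delta = (count * centers[j] - features[mask].sum(axis=0)) / (1 + count)
        new_centers[j] = centers[j] - alpha * delta
    return new_centers

small_step = update_centers(centers, features, labels, alpha=0.5)
big_step = update_centers(centers, features, labels, alpha=1.0)
```

Comparing `small_step` and `big_step` against `centers` shows the larger α moves the centers proportionally further, which is one plausible way centers could grow unstable once the scaling argument is removed.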

dotaa (Author) commented Mar 1, 2018

I tried debugging, but I am not able to understand it. I printed out the centers, and after some time the center values become very large and then go to NaN. Is this a bug in the code? The printed centers look like this:
(screenshot of printed center values, 2018-03-01)
Can you help, please?

Update:
I am using SGD with momentum instead of Adam. The cost value of the center loss alone is as high as 700.0, but now it is no longer going to NaN.
