blob size exceeds INT_MAX: did you have this error? #5
Comments
This is a common problem with Caffe and/or HDF5; see BVLC/caffe#3084.
@holgerroth I installed opencl-caffe, but that did not resolve the error. I have one volume in each HDF5 file. The input shape of the first volume in the HDF5 dataset is 1 1 104 281 389 (NxCxDxHxW). When it reaches the Concat layer (concat_d2c_u2a-b), an error is raised; one of its inputs has the following shape: 1 256 19 64 91 scaled2c_relu_d2c_0_split_1
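A quick sanity check for "blob size exceeds INT_MAX" is to multiply out a blob's dimensions and compare against 2^31-1, since Caffe stores element counts in a signed 32-bit int. A minimal sketch, using the two shapes reported above (the helper name is ours, not Caffe's):

```python
# Check whether an NxCxDxHxW blob's element count exceeds the
# signed 32-bit limit that triggers "blob size exceeds INT_MAX".
from math import prod

INT_MAX = 2**31 - 1  # Caffe's blob count is a signed 32-bit int


def fits_in_int(shape):
    """Return (element_count, fits) for a blob shape tuple."""
    count = prod(shape)
    return count, count <= INT_MAX


print(fits_in_int((1, 1, 104, 281, 389)))  # HDF5 input volume
print(fits_in_int((1, 256, 19, 64, 91)))   # Concat layer input
```

Note that both shapes quoted in this thread are well under the limit, which suggests the overflow is triggered by some other, larger blob (or by the HDF5 reading path itself) rather than by these two.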
Can you share your prototxt file as well, please?
Hi, I tried to run the 3D-Unet code with your pretrained model, which is why I am asking this question. Did you face this error? I tried to apply your pre-trained model to my data; if you hit the same error, could you please guide me? I installed the 3D-Unet patch on Caffe, then tried to install opencl-caffe, but was not successful. Thanks