Squeezenet Tutorial: Predictor outputs initialization information and exits #1352

Open
mattm401 opened this Issue Oct 18, 2017 · 6 comments


mattm401 commented Oct 18, 2017

Similar to #799, I am trying to run through the Loading Pre-Trained Models tutorial with GPU/CUDA on Windows 10: https://caffe2.ai/docs/tutorial-loading-pre-trained-models.html

The following code executes:

```python
from caffe2.proto import caffe2_pb2
from caffe2.python import core, workspace

device_opts = core.DeviceOption(caffe2_pb2.CUDA, 0)
workspace.FeedBlob('data', img, device_option=device_opts)

# Load and run the init net on the GPU
init_def = caffe2_pb2.NetDef()
with open(INIT_NET, 'rb') as f:
    init_def.ParseFromString(f.read())
init_def.device_option.CopyFrom(device_opts)
workspace.RunNetOnce(init_def.SerializeToString())

# Load and create the predict net on the GPU
net_def = caffe2_pb2.NetDef()
with open(PREDICT_NET, 'rb') as f:
    net_def.ParseFromString(f.read())
net_def.device_option.CopyFrom(device_opts)
workspace.CreateNet(net_def.SerializeToString())

print 'Running net...'
p = workspace.Predictor(init_def, net_def)
```

However, the script prints a dump of the model/predictor initialization data and then immediately exits, without running the rest of the code or giving any indication of why it exited. Has anyone seen this behavior before?

(screenshot: model/predictor output printed before the exit)
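
One thing I may try, in case the crash comes from a Python-level exception that just isn't being surfaced: wrapping the Predictor construction in a try/except (a sketch, assuming the same init_def/net_def as above; a hard abort in the native layer would not be caught this way).

```python
import sys
import traceback

try:
    # Same call as above; if the pybind layer raises a Python exception,
    # this should print it before the interpreter exits.
    p = workspace.Predictor(init_def, net_def)
except Exception:
    traceback.print_exc()
    sys.stdout.flush()
    sys.stderr.flush()
    raise
```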

mattm401 commented Oct 18, 2017

Tried running with core.DeviceScope(device) set to CPU and it produced the same output, so this is potentially unrelated to GPU.
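
For reference, the CPU attempt looked roughly like this (a sketch; it assumes FeedBlob picks up the device from the surrounding scope when no device_option is passed):

```python
# CPU attempt: same loading code as before, but under a CPU device scope.
device = core.DeviceOption(caffe2_pb2.CPU, 0)
with core.DeviceScope(device):
    workspace.FeedBlob('data', img)
    workspace.RunNetOnce(init_def.SerializeToString())
    workspace.CreateNet(net_def.SerializeToString())
    p = workspace.Predictor(init_def, net_def)
```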

mattm401 commented Oct 18, 2017

Okay, I was able to capture the error:

```
ERROR __main__ Incompatible constructor arguments. The following argument types are supported: 1. caffe2.python.caffe2_pybind11_state_gpu.Predictor(str, str) Invoked with: name: "squeezenet_init" op { output: "conv1_w" name: "" type: "GivenTensorFill" arg { name: "shape" ints: 64 ints: 3 ints: 3 ints: 3 } arg { name: "values"
floats: 0.257234632969
```
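
Based on that message, the pybind Predictor binding apparently wants two serialized protobuf strings rather than NetDef objects, so passing the raw file bytes (as the tutorial does) should presumably satisfy the constructor, e.g.:

```python
# Pass serialized protobuf strings to Predictor instead of NetDef objects.
with open(INIT_NET, 'rb') as f:
    init_net_str = f.read()
with open(PREDICT_NET, 'rb') as f:
    predict_net_str = f.read()

p = workspace.Predictor(init_net_str, predict_net_str)
results = p.run([img])  # as in the tutorial
```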

mattm401 commented Oct 19, 2017

Running in CPU mode, the output is still the same, but I cannot capture the error...

mattm401 changed the title from Squeezenet Tutorial: Predictor GPU *Outputs predictor information and exits* to Squeezenet Tutorial: Predictor outputs initialization information and exits Oct 19, 2017

mattm401 commented Oct 19, 2017

Worked this out in CPU mode: I scrapped the .pb files I had and re-pulled them from the GitHub models repo (updated init to exec in your code). However, I get an error when switching to GPU/CUDA:

```
blob->template IsType<TensorCPU>(). Blob is not a CPU Tensor: data
```

Does this suggest that Predictor only works on the CPU?
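
If Predictor really is CPU-only, one possible workaround is to skip the wrapper and run the predict net directly with CUDA device options, roughly like this (a sketch; 'softmaxout' is my guess at the output blob name and may differ for your predict_net):

```python
# Run the nets directly on the GPU instead of going through Predictor.
device_opts = core.DeviceOption(caffe2_pb2.CUDA, 0)

workspace.FeedBlob('data', img, device_option=device_opts)

init_def.device_option.CopyFrom(device_opts)
workspace.RunNetOnce(init_def.SerializeToString())

net_def.device_option.CopyFrom(device_opts)
workspace.CreateNet(net_def.SerializeToString(), overwrite=True)
workspace.RunNet(net_def.name)

# FetchBlob copies the result back to a numpy array on the host.
results = workspace.FetchBlob('softmaxout')  # output blob name is an assumption
```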

CriCL commented Jan 27, 2018

Hi @mattm401, were you able to solve the "Blob is not a CPU Tensor: data" issue?

I'm facing the same situation.

mattm401 commented Feb 7, 2018

I haven't had a chance to look into this particular issue. I ended up getting my application running in CPU mode, and speed hasn't been a problem for my use case.
