mean file from caffe model #1987
Comments
Please refer to this article: https://stackoverflow.com/questions/41512845/what-does-caffe-do-with-the-mean-binary-file The mean file is a way to do normalization. If it is not part of your ONNX file, you need to apply it in your pre-processing step; if it is already baked into your converted ONNX file, you do not need the file anymore.
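If the mean is not in the ONNX graph, the pre-processing step is just a subtraction before you feed the tensor to the session. A minimal sketch in NumPy (the per-channel BGR values below are illustrative placeholders, not read from a real `mean.binaryproto`):

```python
import numpy as np

# Hypothetical per-channel mean; shape (3, 1, 1) broadcasts over H and W
channel_mean = np.array([104.0, 117.0, 123.0], dtype=np.float32).reshape(3, 1, 1)

def preprocess(image):
    """Subtract the channel mean from a (C, H, W) image, returning float32."""
    return image.astype(np.float32) - channel_mean

# Dummy 3x4x4 image filled with the value 120
image = np.full((3, 4, 4), 120.0, dtype=np.float32)
out = preprocess(image)
print(out[0, 0, 0], out[1, 0, 0], out[2, 0, 0])  # 16.0 3.0 -3.0
```

The result of `preprocess` is what you would pass to `sess.run` in place of the raw image.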
@anguoyang I use this to read the mean file and fold the subtraction into the model before exporting to ONNX:

```python
import numpy as np
import torch
import torch.nn as nn


def read_blob(meanmodel):
    """Read a Caffe mean blob.

    :param meanmodel: path to the mean blob (.binaryproto)
    """
    import caffe.proto.caffe_pb2 as caffe_pb2
    mean_blob = caffe_pb2.BlobProto()
    with open(meanmodel, 'rb') as fp:
        mean_blob.ParseFromString(fp.read())
    return mean_blob


class FloatSubtract(nn.Module):
    def __init__(self, dc):
        """Subtract a constant from the input tensor.

        `dc` can be anything that can be subtracted: a per-pixel mean
        (like the landmark example) or a per-channel mean (like the
        celeb example). This is a plain subtraction; fewer operations
        mean a smaller network (no need to divide by 1.0, for example).

        :param dc: the constant to subtract
        """
        super(FloatSubtract, self).__init__()
        self.dc = dc

    def extra_repr(self):
        """Extra information"""
        return 'dc.shape={}'.format(self.dc.shape)

    def forward(self, x):
        return x.float() - self.dc.to(x.device)


mean_blob = read_blob(meanfile)
pixel_mean = np.array(mean_blob.data, dtype='float32')
# reshape to (1, C, H, W) so it lines up with a batched input
pixel_mean = pixel_mean.reshape(mean_blob.channels, mean_blob.height, mean_blob.width)[np.newaxis]
dc = torch.from_numpy(pixel_mean)
# Just make sure the dimensions of `dc` broadcast with the input image
dep_model = nn.Sequential(FloatSubtract(dc), model)
```
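The broadcasting caveat above is the whole trick: `dc` can be either a full per-pixel mean of shape `(1, C, H, W)` or a per-channel mean of shape `(C, 1, 1)`, and both subtract cleanly from an `(N, C, H, W)` batch. A quick NumPy sketch of the two cases (the shapes and values here are illustrative):

```python
import numpy as np

# A batch of two 3-channel 8x8 images, every pixel set to 100
batch = np.full((2, 3, 8, 8), 100.0, dtype=np.float32)

# Per-pixel mean: one value for every (channel, row, col) position
per_pixel = np.random.rand(1, 3, 8, 8).astype(np.float32)
# Per-channel mean: one value per channel, broadcast over H and W
per_channel = np.array([10.0, 20.0, 30.0], dtype=np.float32).reshape(3, 1, 1)

print((batch - per_pixel).shape)    # (2, 3, 8, 8)
print((batch - per_channel).shape)  # (2, 3, 8, 8)
```

A mean with any other shape would either raise a broadcasting error or silently produce the wrong result, so it is worth checking the shape once before exporting.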
Hi @dashesy, thank you so much, this is what I want!
Hi all,
I converted a Caffe model to ONNX successfully and want to run inference with onnxruntime. The problem is that with Caffe I had to use the mean file `mean.binaryproto`, but there is no documentation on how to use the mean file when inferring with onnxruntime. Does
sess = nxrun.InferenceSession("./deploy.onnx")
become something like sess = nxrun.InferenceSession("./deploy.onnx", "mean.binaryproto")?
How can I apply the mean file when running inference? Thank you.