feature extraction (using V2 and V3) #12

Closed · inverai opened this issue May 7, 2019 · 3 comments
inverai commented May 7, 2019

---> 2 face_feature_extractor = MobileFaceFeatureExtractor(model_file, epoch, batch_size, context, gpu_id)

in __init__(self, model_file, epoch, batch_size, context, gpu_id)
10 self.model.bind(for_training = False, data_shapes=[('data', (self.batch_size, 1, 100, 100))])
11 sym, arg_params, aux_params = mxnet.model.load_checkpoint(self.model_file, self.epoch)
---> 12 self.model.set_params(arg_params, aux_params)
13
14

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\module\module.py in set_params(self, arg_params, aux_params, allow_missing, force_init, allow_extra)
348 self.init_params(initializer=None, arg_params=arg_params, aux_params=aux_params,
349 allow_missing=allow_missing, force_init=force_init,
--> 350 allow_extra=allow_extra)
351 return
352

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\module\module.py in init_params(self, initializer, arg_params, aux_params, allow_missing, force_init, allow_extra)
307 for name, arr in sorted(self._arg_params.items()):
308 desc = InitDesc(name, attrs.get(name, None))
--> 309 _impl(desc, arr, arg_params)
310
311 for name, arr in sorted(self._aux_params.items()):

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\module\module.py in _impl(name, arr, cache)
295 # just in case the cached array is just the target itself
296 if cache_arr is not arr:
--> 297 cache_arr.copyto(arr)
298 else:
299 if not allow_missing:

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\ndarray\ndarray.py in copyto(self, other)
2072 warnings.warn('You are attempting to copy an array to itself', RuntimeWarning)
2073 return False
-> 2074 return _internal._copyto(self, out=other)
2075 elif isinstance(other, Context):
2076 hret = NDArray(_new_alloc_handle(self.shape, other, True, self.dtype))

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\ndarray\register.py in _copyto(data, out, name, **kwargs)

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\_ctypes\ndarray.py in _imperative_invoke(handle, ndargs, keys, vals, out)
90 c_str_array(keys),
91 c_str_array([str(s) for s in vals]),
---> 92 ctypes.byref(out_stypes)))
93
94 if original_output is not None:

D:\UserInfo\DownDir\Conda\envs\tensor\lib\site-packages\mxnet\base.py in check_call(ret)
250 """
251 if ret != 0:
--> 252 raise MXNetError(py_str(_LIB.MXGetLastError()))
253
254

MXNetError: [21:47:41] c:\jenkins\workspace\mxnet-tag\mxnet\src\operator\tensor../elemwise_op_common.h:135: Check failed: assign(&dattr, vec.at(i)) Incompatible attr in node at 0-th output: expected [32,3,3,3], got [32,1,3,3]
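
The shape mismatch is readable straight off the error: the checkpoint's first conv weight is [32, 3, 3, 3], i.e. the network expects 3 input channels, while the module was bound with a 1-channel data shape. A minimal sketch of the intended order of operations (the checkpoint prefix, epoch, and batch size below are hypothetical placeholders, not values from this repo):

```python
import mxnet as mx

# Hypothetical prefix/epoch/batch values for illustration only;
# the real ones come from the repo's MobileFace model files.
model_file, epoch, batch_size = 'model/MobileFace_V2', 0, 8

# Load the checkpoint first, then bind with a data shape that matches it.
sym, arg_params, aux_params = mx.model.load_checkpoint(model_file, epoch)

mod = mx.mod.Module(symbol=sym, context=mx.cpu(),
                    data_names=['data'], label_names=None)
# conv1_weight in the checkpoint is [32, 3, 3, 3], i.e. 3 input channels,
# so the data shape must be (batch, 3, H, W); binding with 1 channel gives
# the "expected [32,3,3,3], got [32,1,3,3]" mismatch above.
mod.bind(for_training=False, data_shapes=[('data', (batch_size, 3, 100, 100))])
mod.set_params(arg_params, aux_params)
```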

inverai commented May 7, 2019

self.model.bind(for_training = False, data_shapes=[('data', (self.batch_size, 3, 100, 100))])

After changing the channel count from 1 to 3, I now get:

---> 22 self.model.forward(Batch([mxnet.nd.array(batch_data)]))
MXNetError: [21:53:02] C:\Jenkins\workspace\mxnet-tag\mxnet\src\executor\graph_executor.cc:858: Shape of unspecified arg: conv1_weight changed. This can cause the new executor to not share parameters with the old one. Please check for error in network. If this is intended, set partial_shaping=True to suppress this warning.
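
For context, this follow-up error usually means an executor created by the earlier bind (with the old 1-channel shape) is still around, so MXNet refuses to silently reshape conv1_weight. A sketch of one clean sequence, again with hypothetical prefix/epoch/batch placeholders: bind exactly once with the 3-channel shape (or pass force_rebind=True when re-binding in the same session), load the params, and keep every forward batch at the bound shape:

```python
import mxnet as mx
from collections import namedtuple

Batch = namedtuple('Batch', ['data'])  # minimal stand-in for a DataBatch

# Hypothetical prefix/epoch values, as above.
sym, arg_params, aux_params = mx.model.load_checkpoint('model/MobileFace_V2', 0)
mod = mx.mod.Module(symbol=sym, context=mx.cpu(), label_names=None)

batch_size = 8
# force_rebind=True discards any executor bound with an older shape instead of
# trying to reshape it, which is what raises the "Shape of unspecified arg:
# conv1_weight changed" error above.
mod.bind(for_training=False,
         data_shapes=[('data', (batch_size, 3, 100, 100))],
         force_rebind=True)
mod.set_params(arg_params, aux_params)

# Every forward batch must match the bound shape exactly.
batch_data = mx.nd.zeros((batch_size, 3, 100, 100))  # placeholder for preprocessed faces
mod.forward(Batch([batch_data]), is_train=False)
features = mod.get_outputs()[0].asnumpy()  # one feature vector per input face
```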

becauseofAI (Owner) commented

@inverai
Can you run python3 get_face_feature_v2_mxnet.py or python3 get_face_feature_v3_mxnet.py directly, and do you get the right result?


inverai commented May 13, 2019

...

inverai closed this as completed May 13, 2019