interpreter->Invoke(); Segmentation fault in c++ inference #64608
Labels:
- `comp:lite`: TF Lite related issues
- `stale`: marks the issue/PR stale, to be closed automatically if no activity
- `stat:awaiting response`: status, awaiting response from author
I am trying to deploy my own object detection model in C++. When I run the code, I keep getting a segmentation fault.

The copy into the input tensor seems to succeed:

```cpp
cerr << memcpy(interpreter->typed_input_tensor<uchar>(0), image.data, image.total() * image.elemSize()) << endl;
```

prints `0x7fff8c2fe580` (the destination pointer returned by `memcpy`). However, a debug print (`5`) placed after it never appears, which suggests the problem comes from `interpreter->Invoke()`.
This is what I am getting in the debugger:

```
#4 0x00007ffff7cda369 in tflite::optimized_ops::Conv(tflite::ConvParams const&, tflite::RuntimeShape const&, unsigned char const*, tflite::RuntimeShape const&, unsigned char const*, tflite::RuntimeShape const&, int const*, tflite::RuntimeShape const&, unsigned char*, tflite::RuntimeShape const&, unsigned char*, tflite::CpuBackendContext*) ()
#5 0x00007ffff7cdabc9 in void tflite::ops::builtin::conv::EvalQuantized<(tflite::ops::builtin::conv::KernelType)2>(TfLiteContext*, TfLiteNode*, TfLiteConvParams*, tflite::ops::builtin::conv::OpData*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*) ()
```
I am stuck and not sure how to fix this problem.