
Cannot get any object detection #10

Closed
tmanchester opened this issue Dec 27, 2019 · 9 comments

@tmanchester

  • Raspberry Pi Deep PanTilt version:
  • Python version: 3.7
  • Operating System: Raspbian Buster

Description

Running rpi-deep-pantilt detect, I get the error ValueError: Failed to convert value into readable tensor.

What I Did

pi@raspberrypi:~ $ rpi-deep-pantilt detect
2019-12-27 21:07:25.972826: E tensorflow/core/platform/hadoop/hadoop_file_system.cc:132] HadoopFileSystem load error: libhdfs.so: cannot open shared object file: No such file or directory
WARNING:root:Limited tf.compat.v2.summary API due to missing TensorBoard installation.
WARNING:root:Limited tf.compat.v2.summary API due to missing TensorBoard installation.
Traceback (most recent call last):
  File "/home/pi/.local/bin/rpi-deep-pantilt", line 10, in <module>
    sys.exit(main())
  File "/home/pi/.local/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 107, in main
    cli()
  File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/pi/.local/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 60, in detect
    run_detect(capture_manager, model)
  File "/home/pi/.local/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 31, in run_detect
    prediction = model.predict(frame)
  File "/home/pi/.local/lib/python3.7/site-packages/rpi_deep_pantilt/detect/ssd_mobilenet_v3_coco.py", line 282, in predict
    self.input_details[0]['index'], input_tensor)
  File "/home/pi/.local/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter.py", line 347, in set_tensor
    self._interpreter.SetTensor(tensor_index, value)
  File "/home/pi/.local/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 140, in SetTensor
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_SetTensor(self, i, value)
ValueError: Failed to convert value into readable tensor.

@leigh-johnson
Member

What version of TensorFlow do you have installed?

pip show tensorflow

@tmanchester
Author

Name: tensorflow
Version: 1.14.0
Summary: TensorFlow is an open source machine learning framework for everyone.
Home-page: https://www.tensorflow.org/
Author: Google Inc.
Author-email: packages@tensorflow.org
License: Apache 2.0
Location: /home/pi/.local/lib/python3.7/site-packages
Requires: numpy, tensorboard, gast, tensorflow-estimator, wheel, grpcio, protobuf, google-pasta, six, termcolor, wrapt, keras-preprocessing, absl-py, astor, opt-einsum, keras-applications
Required-by:

@leigh-johnson
Member

Ah, thank you for the report. Can you try uninstalling TensorFlow 1.x and installing this community-built TensorFlow 2.0 wheel?

pip uninstall tensorflow
pip install https://github.com/leigh-johnson/Tensorflow-bin/blob/master/tensorflow-2.0.0-cp37-cp37m-linux_armv7l.whl?raw=true
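
A quick sanity check after the swap (not a step from the project's docs, just a way to confirm which version is active):

# Run in a Python 3.7 shell on the Pi; expect 2.0.0 with the wheel above.
import tensorflow as tf
print(tf.__version__)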

@tmanchester
Author

Thanks, that worked, but now I have a different problem: it doesn't seem to track objects, and it always tries to pan and tilt out of range. My HDMI cable doesn't seem to be working at the moment, so I don't actually know what it's detecting (I'll buy a new one tomorrow), but here's the output when I try to track a book right in front of the camera:

pi@raspberrypi:~ $ rpi-deep-pantilt track --label='book' --loglevel=DEBUG
WARNING:root:Limited tf.compat.v2.summary API due to missing TensorBoard installation.
WARNING:root:Limited tf.compat.v2.summary API due to missing TensorBoard installation.
WARNING:root:Limited tf.summary API due to missing TensorBoard installation.
INFO: Initialized TensorFlow Lite runtime.
INFO:root:loaded labels from /home/pi/.local/lib/python3.7/site-packages/rpi_deep_pantilt/data/mscoco_label_map.pbtxt
 {1: {'id': 1, 'name': 'person'}, 2: {'id': 2, 'name': 'bicycle'}, 3: {'id': 3, 'name': 'car'}, 4: {'id': 4, 'name': 'motorcycle'}, 5: {'id': 5, 'name': 'airplane'}, 6: {'id': 6, 'name': 'bus'}, 7: {'id': 7, 'name': 'train'}, 8: {'id': 8, 'name': 'truck'}, 9: {'id': 9, 'name': 'boat'}, 10: {'id': 10, 'name': 'traffic light'}, 11: {'id': 11, 'name': 'fire hydrant'}, 13: {'id': 13, 'name': 'stop sign'}, 14: {'id': 14, 'name': 'parking meter'}, 15: {'id': 15, 'name': 'bench'}, 16: {'id': 16, 'name': 'bird'}, 17: {'id': 17, 'name': 'cat'}, 18: {'id': 18, 'name': 'dog'}, 19: {'id': 19, 'name': 'horse'}, 20: {'id': 20, 'name': 'sheep'}, 21: {'id': 21, 'name': 'cow'}, 22: {'id': 22, 'name': 'elephant'}, 23: {'id': 23, 'name': 'bear'}, 24: {'id': 24, 'name': 'zebra'}, 25: {'id': 25, 'name': 'giraffe'}, 27: {'id': 27, 'name': 'backpack'}, 28: {'id': 28, 'name': 'umbrella'}, 31: {'id': 31, 'name': 'handbag'}, 32: {'id': 32, 'name': 'tie'}, 33: {'id': 33, 'name': 'suitcase'}, 34: {'id': 34, 'name': 'frisbee'}, 35: {'id': 35, 'name': 'skis'}, 36: {'id': 36, 'name': 'snowboard'}, 37: {'id': 37, 'name': 'sports ball'}, 38: {'id': 38, 'name': 'kite'}, 39: {'id': 39, 'name': 'baseball bat'}, 40: {'id': 40, 'name': 'baseball glove'}, 41: {'id': 41, 'name': 'skateboard'}, 42: {'id': 42, 'name': 'surfboard'}, 43: {'id': 43, 'name': 'tennis racket'}, 44: {'id': 44, 'name': 'bottle'}, 46: {'id': 46, 'name': 'wine glass'}, 47: {'id': 47, 'name': 'cup'}, 48: {'id': 48, 'name': 'fork'}, 49: {'id': 49, 'name': 'knife'}, 50: {'id': 50, 'name': 'spoon'}, 51: {'id': 51, 'name': 'bowl'}, 52: {'id': 52, 'name': 'banana'}, 53: {'id': 53, 'name': 'apple'}, 54: {'id': 54, 'name': 'sandwich'}, 55: {'id': 55, 'name': 'orange'}, 56: {'id': 56, 'name': 'broccoli'}, 57: {'id': 57, 'name': 'carrot'}, 58: {'id': 58, 'name': 'hot dog'}, 59: {'id': 59, 'name': 'pizza'}, 60: {'id': 60, 'name': 'donut'}, 61: {'id': 61, 'name': 'cake'}, 62: {'id': 62, 'name': 'chair'}, 63: {'id': 63, 'name': 'couch'}, 64: {'id': 64, 'name': 'potted plant'}, 65: {'id': 65, 'name': 'bed'}, 67: {'id': 67, 'name': 'dining table'}, 70: {'id': 70, 'name': 'toilet'}, 72: {'id': 72, 'name': 'tv'}, 73: {'id': 73, 'name': 'laptop'}, 74: {'id': 74, 'name': 'mouse'}, 75: {'id': 75, 'name': 'remote'}, 76: {'id': 76, 'name': 'keyboard'}, 77: {'id': 77, 'name': 'cell phone'}, 78: {'id': 78, 'name': 'microwave'}, 79: {'id': 79, 'name': 'oven'}, 80: {'id': 80, 'name': 'toaster'}, 81: {'id': 81, 'name': 'sink'}, 82: {'id': 82, 'name': 'refrigerator'}, 84: {'id': 84, 'name': 'book'}, 85: {'id': 85, 'name': 'clock'}, 86: {'id': 86, 'name': 'vase'}, 87: {'id': 87, 'name': 'scissors'}, 88: {'id': 88, 'name': 'teddy bear'}, 89: {'id': 89, 'name': 'hair drier'}, 90: {'id': 90, 'name': 'toothbrush'}}
INFO:root:initialized model ssd_mobilenet_v3_small_coco_2019_08_14

INFO:root:model inputs: [{'name': 'normalized_input_image_tensor', 'index': 374, 'shape': array([  1, 320, 320,   3]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}]
 [{'name': 'normalized_input_image_tensor', 'index': 374, 'shape': array([  1, 320, 320,   3]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}]
INFO:root:model outputs: [{'name': 'TFLite_Detection_PostProcess', 'index': 366, 'shape': array([ 1, 10,  4]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}, {'name': 'TFLite_Detection_PostProcess:1', 'index': 367, 'shape': array([ 1, 10]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}, {'name': 'TFLite_Detection_PostProcess:2', 'index': 368, 'shape': array([ 1, 10]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}, {'name': 'TFLite_Detection_PostProcess:3', 'index': 369, 'shape': array([1]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}]
 [{'name': 'TFLite_Detection_PostProcess', 'index': 366, 'shape': array([ 1, 10,  4]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}, {'name': 'TFLite_Detection_PostProcess:1', 'index': 367, 'shape': array([ 1, 10]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}, {'name': 'TFLite_Detection_PostProcess:2', 'index': 368, 'shape': array([ 1, 10]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}, {'name': 'TFLite_Detection_PostProcess:3', 'index': 369, 'shape': array([1]), 'dtype': <class 'numpy.float32'>, 'quantization': (0.0, 0)}]
INFO:root:starting camera preview
INFO:root:Tracking book center_x 127 center_y 172
INFO:root:Tracking book center_x 310 center_y 133
INFO:root:Tracking book center_x 241 center_y 138
INFO:root:Tracking book center_x 271 center_y 116
INFO:root:Tracking book center_x 272 center_y 111
INFO:root:Tracking book center_x 66 center_y 176
INFO:root:Tracking book center_x 198 center_y 36
INFO:root:Tracking book center_x 264 center_y 106
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 41 1216
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 41 1216
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 41 1216
INFO:root:tilt_angle not in range 90.01197390556335
...
INFO:root:pan_angle not in range 101.58848788738251
INFO:root:tilt_angle not in range 103.40280756950378
INFO:root:pan_angle not in range 101.58848788738251
INFO:root:tilt_angle not in range 103.40280756950378
INFO:root:pan_angle not in range 101.58848788738251
INFO:root:tilt_angle not in range 103.40280756950378
INFO:root:pan_angle not in range 101.70742213726045
INFO:root:tilt_angle not in range 103.54436898231506
INFO:root:pan_angle not in range 101.70742213726045
INFO:root:tilt_angle not in range 103.54436898231506
INFO:root:pan_angle not in range 101.8247397184372
INFO:root:tilt_angle not in range 103.6637011051178
INFO:root:pan_angle not in range 101.8247397184372
INFO:root:tilt_angle not in range 103.6637011051178
INFO:root:pan_angle not in range 101.8247397184372
INFO:root:tilt_angle not in range 103.6637011051178
INFO:root:pan_angle not in range 101.8247397184372
INFO:root:tilt_angle not in range 103.6637011051178
INFO:root:pan_angle not in range 101.8247397184372
INFO:root:tilt_angle not in range 103.6637011051178
INFO:root:pan_angle not in range 101.8247397184372
INFO:root:tilt_angle not in range 103.6637011051178
INFO:root:pan_angle not in range 101.94816443920136
INFO:root:tilt_angle not in range 103.79902091026307
INFO:root:pan_angle not in range 101.94816443920136
INFO:root:tilt_angle not in range 103.79902091026307
INFO:root:pan_angle not in range 101.94816443920136
INFO:root:tilt_angle not in range 103.79902091026307
INFO:root:pan_angle not in range 101.94816443920136
INFO:root:tilt_angle not in range 103.79902091026307
INFO:root:pan_angle not in range 101.94816443920136
^C[INFO] You pressed `ctrl + c`! Exiting...

I omitted most of the tilt_angle not in range lines because I think you get the idea.

@leigh-johnson
Member

When you run rpi-deep-pantilt test pantilt, do the pan/tilt servos move in a sinusoid from lower-left to upper-right?

If the motion doesn't look like the .gif below, try swapping your servo connectors (e.g. unplug the connector from channel 1 and plug it into channel 2, and plug the connector from channel 2 into channel 1).
https://miro.medium.com/max/1080/1*SOV1U1PAojGui2RohhhqPA.gif
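
For reference, a sweep like that can be reproduced in a few lines. This is only a rough sketch assuming the Pimoroni pantilthat library is driving the HAT, not the project's own test code:

# Sweep both servos smoothly through their range in a sinusoid.
import math
import time

import pantilthat

for step in range(200):
    angle = 90 * math.sin(step * 0.1)   # stays within [-90, 90]
    pantilthat.pan(int(angle))
    pantilthat.tilt(int(angle))
    time.sleep(0.05)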

@tmanchester
Author

Yep, that works perfectly. I'll report back when I have a working HDMI connection so I can see what the camera is seeing.

INFO:root:starting camera preview
INFO:root:Tracking car center_x 159 center_y 149
INFO:root:Tracking car center_x 159 center_y 146
INFO:root:Tracking car center_x 159 center_y 138
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 41 1216
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 41 1216
INFO:root:Tracking car center_x 50 center_y 307
INFO:root:Tracking car center_x 160 center_y 159
INFO:root:Tracking car center_x 159 center_y 131
INFO:root:Tracking car center_x 79 center_y 79
INFO:root:tilt_angle not in range 90.2494945049286
INFO:root:tilt_angle not in range 90.2494945049286
INFO:root:tilt_angle not in range 90.45066661834719
INFO:root:tilt_angle not in range 90.45066661834719

I'm currently in a taxi, so I tried to track a car behind me. It seems to find the car at first, but then it slowly tilts up or down to 90 degrees and hits the tilt limit.

@tmanchester
Author

tmanchester commented Dec 29, 2019

Okay, I think I've found it:
I noticed that it seemed to be doing the opposite of the expected movement, so in manager.py I changed

pan_angle = -1 * pan.value
tilt_angle = tilt.value

on lines 106 and 107 to

pan_angle = pan.value
tilt_angle = -1 * tilt.value

Now it doesn't try to pan or tilt past the limit, and it pans to follow me when I stand in front of it and move to the side. It seems to get stuck on INFO:root:starting camera preview when I define a label now, though...

Edit: I flipped the camera image in camera.py instead, since I figured it was the camera image that was upside down. I'm not sure whether an upside-down image affects detection, but I thought I may as well.
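
For anyone hitting the same thing, here is a minimal sketch of flipping the image at the camera, assuming the stream wraps picamera's PiCamera class (resolution and framerate are illustrative):

from picamera import PiCamera

camera = PiCamera(resolution=(320, 320), framerate=24)
# Rotating 180 degrees corrects an upside-down mount without touching the tracking math.
camera.rotation = 180
# Alternatively, flip each axis individually:
# camera.vflip = True
# camera.hflip = True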

@leigh-johnson
Member

That makes sense, @tmanchester - thank you for the report!

A flipped camera will not affect detection, but the tracking angle will be incorrect (as you worked around by inverting the pan/tilt signs).

I'll add a debug command for camera orientation in the next release. That should make it easier for anyone who runs into this issue. Glad you got it working!

@davefont

Dear all,

Could you please help me?

In tracking mode, with the pan and tilt servos working: how can I get back in range (for both pan and tilt) after they go out of range, the message pan_angle not in range -95.33 appears, and both servos finally stop?

It happens when the condition in_range(pan_angle, SERVO_MIN, SERVO_MAX) is not fulfilled and execution falls into the else branch (for both pan and tilt):

else:
    logging.info(f'pan_angle not in range {pan_angle}')

How can I reset both servos to position 0 and begin the object search again?

Thanks a lot in advance,
David Garcia
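
One possible approach, as a sketch only: clamp the requested angle to the nearest limit instead of skipping the update, so the servo parks at its edge and recovers as soon as the target moves back into range. SERVO_MIN, SERVO_MAX, and set_servo_pan below are stand-ins for whatever your copy of manager.py uses:

import logging

SERVO_MIN, SERVO_MAX = -90, 90  # illustrative limits

def clamp(angle, lo, hi):
    # Constrain angle to the closed interval [lo, hi].
    return max(lo, min(hi, angle))

def set_servo_pan(angle):
    # Stand-in for the call that actually drives the pan servo.
    logging.info(f'pan -> {angle:.2f}')

def update_pan(pan_angle):
    clamped = clamp(pan_angle, SERVO_MIN, SERVO_MAX)
    if clamped != pan_angle:
        logging.info(f'pan_angle {pan_angle:.2f} out of range, clamped to {clamped:.2f}')
    set_servo_pan(clamped)

The same change applies to the tilt branch; re-centering both servos at 0 when tracking stops would be a separate tweak in the same loop.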
