Description
Hi everyone.
I'm trying to develop an app to process sign language using React Native, Expo and @mediapipe/handpose, but the performance I get is too slow for what we need. The smartphone I'm testing with is the Samsung Galaxy S9, which is not at all a bad phone, and the FPS I get is around 2 ~ 6 fps;
The process is very simple: I load the handpose model, then use `cameraWithTensors` from TensorFlow.js React Native to create a loop that processes frames from the smartphone camera, and inside this loop I call `model.estimateHands()`.
But when the `estimateHands` method is called, even without rendering the camera preview, performance drops drastically. This makes it almost useless for our purposes, because we need to run a second model on the keypoints it generates. Compared with the examples I've seen on the MediaPipe website, the results I'm getting are very inconsistent. Do you think the problem is caused by my code?
Here's a video of how it looks:
react-native-test.mp4
(Sending photos because, as you can see in the edit history, I couldn't manage to format it properly 🤭)