
iOS: example not working as expected #5

Closed
tarikdev1 opened this issue Feb 19, 2024 · 7 comments

@tarikdev1

When I run your example on iOS, I get this error:

Frame Processor Error: Regular javascript function '' cannot be shared. Try decorating the function with the 'worklet' keyword to allow the javascript function to be used as a worklet., js engine: VisionCamera

Any help?

tarikdev1 changed the title from "example not working as expected" to "iOS: example not working as expected" on Feb 19, 2024
@luicfrr
Owner

luicfrr commented Feb 19, 2024

Did you add react-native-worklets-core (and processNestedWorklets if you're using reanimated) to your babel.config.js?

Can you please provide the full code you're using?
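
For reference, with worklets-core plus reanimated the plugins section should look roughly like this (keeping your own preset list):

module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: [
    // worklets-core plugin
    ['react-native-worklets-core/plugin'],
    [
      'react-native-reanimated/plugin',
      {
        // lets worklets nested inside other worklets (e.g. runAsync) be processed
        processNestedWorklets: true
      }
    ],
  ],
};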

@tarikdev1
Author

For sure, this is my babel.config.js:

module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: [
    ['react-native-worklets-core/plugin'],
    [
      'react-native-reanimated/plugin',
    ],
  ],
};

And this is the demo code:

import { 
  StyleSheet, 
  Text, 
  View 
} from 'react-native'
import { 
  useEffect, 
  useState 
} from 'react'
import {
  Camera,
  useCameraDevice,
  useFrameProcessor,
  runAsync
} from 'react-native-vision-camera'
import { 
  detectFaces,
  DetectionResult 
} from 'react-native-vision-camera-face-detector'
import { Worklets } from 'react-native-worklets-core'

export default function App() {
  const device = useCameraDevice('front')

  useEffect(() => {
    (async () => {
      const status = await Camera.requestCameraPermission()
      console.log({ status })
    })()
  }, [device])

  const handleDetectionWorklet = Worklets.createRunInJsFn( (
    result: DetectionResult
  ) => { 
    console.log( 'detection result', result )
  })
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    runAsync(frame, () => {
      'worklet'
      detectFaces(
        frame,
        handleDetectionWorklet, {
          // detection settings
        }
      )
    })
  }, [handleDetectionWorklet])

  return (
    <View style={{ flex: 1 }}>
      {!!device? <Camera
        style={StyleSheet.absoluteFill}
        device={device}
        frameProcessor={frameProcessor}
      /> : <Text>
        No Device
      </Text>}
    </View>
  )
}

@luicfrr
Owner

luicfrr commented Feb 19, 2024

The problem is that you're missing processNestedWorklets in your babel config.
Take a look here.

@tarikdev1
Author

Thanks, this is my new babel config:


module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: [
    ['react-native-worklets-core/plugin'],
    [
      'react-native-reanimated/plugin',
      {
        processNestedWorklets: true
      }
    ],
  ],
};

But I still have the same issue.

Env:

  • "react-native-reanimated": "^3.4.2",
  • "react-native-vision-camera": "^3.9.0",
  • "react-native-vision-camera-face-detector": "^1.2.3",
  • "react-native": "0.72.3",

Any other suggestions?

@tarikdev1
Author

Turns out I just needed a --reset-cache :)

Thanks for your help!

@tarikdev1
Author

One last question, please: I want to plot a green faceRect on the detected face.

  const onFacesDetected = (result: DetectionResult) => {
    if ( Object.keys(result.faces).length === 0) {
      setFaceDetected(false);
      return;
    }
    setFaceDetected(true);
    drawFaceRect(result.faces['0'], result.frame);
  };
  const deviceWidth = Dimensions.get('screen').width;
  const deviceHeight = Dimensions.get('screen').height;


  const drawFaceRect = (face: any, frame: any) => {
    rect.current?.setNativeProps({
      width: face.bounds.width * (deviceWidth/frame.width),
      height: face.bounds.height * (deviceHeight/frame.height),
      top: face.bounds.top * (deviceHeight/frame.height),
      left: face.bounds.left * (deviceWidth/frame.width),
    });
  };

Is this code correct?

@luicfrr
Owner

luicfrr commented Feb 19, 2024

In my local tests (and knowing that vision camera still has an orientation issue) I had to add some platform-specific code. Here's my current code:

type FacePosType = {
  faceW: number
  faceH: number
  faceX: number
  faceY: number
}

function calcFacePosition(
  bounds: Bounds,
  frame: FrameData
): FacePosType {
  const orientation = ( () => {
    switch ( frame.orientation ) {
      case 'portrait': return 0
      case 'landscape-left': return 90
      case 'portrait-upside-down': return 180
      case 'landscape-right': return 270
    }
  } )()
  const degrees = ( orientation - 90 + 360 ) % 360
  let scaleX = 0
  let scaleY = 0

  if ( !isIos && (
    degrees === 90 ||
    degrees === 270
  ) ) {
    // scale is inverted because of vision camera's orientation issue
    scaleX = windowWidth / frame.height
    scaleY = windowHeight / frame.width
  } else {
    scaleX = windowWidth / frame.width
    scaleY = windowHeight / frame.height
  }

  const faceW = ( bounds.right - bounds.left ) * scaleX
  const faceH = ( bounds.bottom - bounds.top ) * scaleY
  const faceX = ( () => {
    const xPos = bounds.left * scaleX
    if ( isIos ) return xPos
    // invert x pos on android
    return windowWidth - ( xPos + faceW )
  } )()

  return {
    faceW,
    faceH,
    // horizontal face center
    faceX: faceX + ( faceW / 2 ),
    // vertical face center
    faceY: bounds.top * scaleY
  }
}
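
And as a rough sketch of how I'd use it to draw the green rect, something like the hypothetical FaceRect below should get you close. It assumes isIos comes from Platform.OS and windowWidth/windowHeight from Dimensions.get('window'), and it reuses the Bounds/FrameData types and calcFacePosition from the snippet above, so adjust names to your own setup:

import { Dimensions, Platform, View } from 'react-native'

const isIos = Platform.OS === 'ios'
const {
  width: windowWidth,
  height: windowHeight
} = Dimensions.get( 'window' )

// draws a green rect over the detected face, or nothing when there's no face
function FaceRect( { bounds, frame }: {
  bounds?: Bounds,
  frame?: FrameData
} ) {
  if ( !bounds || !frame ) return null
  const {
    faceW,
    faceH,
    faceX,
    faceY
  } = calcFacePosition( bounds, frame )

  return <View
    pointerEvents='none'
    style={ {
      position: 'absolute',
      borderWidth: 2,
      borderColor: 'green',
      width: faceW,
      height: faceH,
      // faceX is the horizontal face center, so shift left by half the width
      left: faceX - ( faceW / 2 ),
      top: faceY
    } }
  />
}

In your onFacesDetected you'd keep the first face's bounds plus the frame in state and render FaceRect on top of the Camera view.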

Closing this issue as the original problem is solved.
