
how to use with local img #15

Closed
Naol-bm opened this issue Jan 6, 2024 · 8 comments


Naol-bm commented Jan 6, 2024

I am working on an image labeling project and need guidance on processing local images from the file system. How can I convert these local image files into tensors to pass them as inputs to the functions? Also, I'm using Expo, and it seems like @tensorflow/tfjs-react-native doesn't work with the current version of Expo.

mrousavy (Owner) commented Jan 8, 2024

You need some way to get an Image's raw data as a number array (raw bytes).

Not sure if fetch can do that in react native?

mrousavy (Owner) commented

I just remembered I have a lib for that, lol. Sometimes I forget what I build.

  1. Install react-native-blob-jsi-helper
  2. Install react-native-fast-tflite

```js
import { getArrayBufferForBlob } from 'react-native-blob-jsi-helper'

const imageResource = await fetch("file://mylocalimage.png")
const blob = await imageResource.blob()
const arrayBuffer = getArrayBufferForBlob(blob)

// make sure the model is appropriately sized and in RGBA though.
// otherwise, create a for loop that modifies the arrayBuffer so it fits the model.
await model.run([arrayBuffer])
```
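A minimal sketch of the for loop mentioned above, for the case where the image bytes are RGBA but the model expects three channels per pixel. This is plain JavaScript and assumes the buffer really is raw RGBA pixel data (4 bytes per pixel); adapt it to whatever layout your model expects.

```javascript
// Repack raw RGBA bytes into an RGB buffer by dropping the alpha channel,
// so the input matches a model that expects 3 channels per pixel.
function rgbaToRgb(rgba) {
  const pixelCount = rgba.length / 4;
  const rgb = new Uint8Array(pixelCount * 3);
  for (let i = 0; i < pixelCount; i++) {
    rgb[i * 3] = rgba[i * 4];         // R
    rgb[i * 3 + 1] = rgba[i * 4 + 1]; // G
    rgb[i * 3 + 2] = rgba[i * 4 + 2]; // B
    // rgba[i * 4 + 3] (alpha) is dropped
  }
  return rgb;
}
```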


mashad6 commented Feb 15, 2024

@mrousavy `await model.run([arrayBuffer])` returns array buffers


mashad6 commented Feb 15, 2024

```js
await ImagePicker.openPicker({
  skipBackup: true,
  includeBase64: false,
  path: 'images',
  width: 300,
  height: 400,
  mediaType: "photo",
}).then(async (image) => {
  console.log(image.path);
  const imageResource = await fetch(image.path)
  const blob = await imageResource.blob()
  const arrayBuffer = getArrayBufferForBlob(blob)
  const results = await model.run([arrayBuffer])
  console.log("res: ", results);
  return
})
```


marcinkornek commented Feb 19, 2024

For me, react-native-blob-jsi-helper was not working: the array length was not correct. So I created a simple library, https://github.com/marcinkornek/react-native-image-to-rgb, which converts an image to an RGB array and is working fine.


mashad6 commented Feb 19, 2024

@marcinkornek how do I give the input to this in the right format? For example, my model accepts a 320 x 320 input.


marcinkornek commented Feb 19, 2024

@mashad6 I was using https://github.com/bamlab/react-native-image-resizer to resize the image. You can do it with:

```js
// resize the image to the given resolution (320x320, format JPEG, quality 100, etc.)
// you can also use https://github.com/callstack/react-native-image-editor which also allows cropping, or use any other library
const resizedPhoto = await ImageResizer.createResizedImage(
  `file://${photoPath}`,
  320,
  320,
  'JPEG',
  100,
  0,
  undefined,
  true,
  {mode: 'stretch'}, // when I was testing it, 'contain' was not working, so I used 'stretch', which is not ideal
);

// convert the resized image to an RGB array
const convertedArray = await convertToRGB(resizedPhoto.uri);

// convert to a Uint8 array buffer (but some models require float32 format)
const arrayBuffer = new Uint8Array(convertedArray);

// load the TensorFlow Lite model
const model = await loadTensorflowModel(require('path/to/model.tflite'));

// run the model with the array buffer
const detectionResults = await model.run([arrayBuffer]);
```

Btw, I'll add an example with react-native-fast-tflite to my library :)
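For the "some models require float32 format" case in the snippet above, the uint8 RGB values can be scaled into a normalized Float32Array before building the input buffer. A minimal sketch in plain JavaScript; dividing by 255 to get a [0, 1] range is a common convention but an assumption here, so check what input range your particular model was trained with:

```javascript
// Convert a uint8 RGB array (values 0–255) into a Float32Array
// normalized to [0, 1], which many float32 models expect as input.
function toFloat32Input(rgbValues) {
  const out = new Float32Array(rgbValues.length);
  for (let i = 0; i < rgbValues.length; i++) {
    out[i] = rgbValues[i] / 255;
  }
  return out;
}
```

You would then pass the result instead of the `Uint8Array`, e.g. `await model.run([toFloat32Input(convertedArray)])`.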


girraj96 commented May 27, 2024

@marcinkornek @mrousavy my tensor input shape is [1, 640, 640, 3], the output shape is [1, 7, 8400], and the model requires a Float32Array buffer.

```js
ImageResizer.createResizedImage(photo.path, 640, 640, 'JPEG', 100, 0)
  .then(async response => {
    const convertedArray = await convertToRGB(response.uri);
    const arrayBuffer = new Float32Array(convertedArray);
    let outputs = model.runSync([arrayBuffer]);
  })
```

But it shows an error: `Input Buffer size (2764800) does not match the Input Tensor's expected size (4915200)! Make sure to resize the input values accordingly.`

How do I do that resizing in this code? Please suggest.
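A likely cause (an assumption, not confirmed in the thread): 2,764,800 bytes is exactly 640 × 360 × 3 float32 values, which suggests the resizer preserved the photo's 16:9 aspect ratio instead of producing a square 640 × 640 image. Passing `{mode: 'stretch'}` as in the earlier example should fix that; a small plain-JavaScript sanity check can also catch shape mismatches with a clearer message before the model runs:

```javascript
// Verify a typed-array input buffer holds exactly as many elements as the
// tensor shape requires (e.g. [1, 640, 640, 3] → 1,228,800 elements),
// so mismatches are caught before calling model.runSync.
function checkInputSize(buffer, shape) {
  const expectedElements = shape.reduce((a, b) => a * b, 1);
  if (buffer.length !== expectedElements) {
    throw new Error(
      `Expected ${expectedElements} elements for shape [${shape}], got ${buffer.length}`
    );
  }
  return true;
}
```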
