
Bug: CORS fetching model face-detection #908

Closed
sebassdominguez opened this issue Apr 8, 2024 · 14 comments

@sebassdominguez

What happened?

I am using:

@tensorflow-models/face-detection & @mediapipe/face_detection

Everything worked fine until this morning, both locally and in all deployed environments. Now we are receiving this CORS error:

Access to fetch at 'https://www.kaggle.com/models/mediapipe/face-detection/frameworks/tfJs/variations/short/versions/1/model.json?tfjs-format=file&tfhub-redirect=true' (redirected from 'https://tfhub.dev/mediapipe/tfjs-model/face_detection/short/1/model.json?tfjs-format=file') from origin 'https://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

Relevant code

import * as faceDetection from '@tensorflow-models/face-detection';
import { FaceDetector } from '@tensorflow-models/face-detection';
import { MediaPipeFaceDetectorModelConfig } from '@tensorflow-models/face-detection/dist/mediapipe/types';

import '@mediapipe/face_detection';


// package.json
...
  "dependencies": {
    "@mediapipe/face_detection": "^0.4.1646425229",
    "@tensorflow-models/face-detection": "^1.0.1",
    "@tensorflow/tfjs": "^4.2.0",
    "@tensorflow/tfjs-backend-wasm": "^4.2.0",
    "@tensorflow/tfjs-core": "^4.14.0",
    ...
  }

Relevant log output

`Access to fetch at 'https://www.kaggle.com/models/mediapipe/face-detection/frameworks/tfJs/variations/short/versions/1/model.json?tfjs-format=file&tfhub-redirect=true' (redirected from 'https://tfhub.dev/mediapipe/tfjs-model/face_detection/short/1/model.json?tfjs-format=file') from origin 'https://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.`

tensorflow_hub Version

0.12.0 (latest stable release)

TensorFlow Version

2.8 (latest stable release)

Other libraries

No response

Python Version

3.x

OS

Linux

@randallb

randallb commented Apr 8, 2024

Same issue for: https://tfhub.dev/google/tfjs-model/movenet/singlepose/lightning/4/model.json?tfjs-format=file

@sgidon

sgidon commented Apr 9, 2024

Same issue for: https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1/model.json?tfjs-format=file


@pranjalieedee

pranjalieedee commented Apr 9, 2024

Facing the same issue.
It is BLOCKING and HIGHEST PRIORITY for us.

Facing the exact same issue as the Original Poster of this bug.
[screenshot of the same CORS error]

Relevant code:

import * as poseDetection from "@tensorflow-models/pose-detection";
import "@tensorflow/tfjs-backend-webgl";
import * as tf from "@tensorflow/tfjs-core";

package.json:

    "@mediapipe/pose": "^0.5.1675469404",
    "@tensorflow-models/pose-detection": "^2.1.3",
    "@tensorflow/tfjs": "^4.17.0",
    "@tensorflow/tfjs-backend-webgl": "^4.17.0",
    "@tensorflow/tfjs-backend-webgpu": "^4.17.0",
    "@tensorflow/tfjs-core": "^4.17.0",

Any guidance on how to solve this?

@cvikir

cvikir commented Apr 9, 2024

We are facing the same issue with the blazeface library; it is a site-breaking bug for us too. I would really appreciate any help with this.

Access to fetch at 'https://www.kaggle.com/models/tensorflow/blazeface/frameworks/tfJs/variations/default/versions/1/model.json?tfjs-format=file&tfhub-redirect=true' (redirected from 'https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1/model.json?tfjs-format=file') from origin 'https://www.irozhlas.cz' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. 
Failed to load resource: the server responded with a status of 404 (Not Found)
cropeditor:1 Access to fetch at 'https://www.kaggle.com/models/tensorflow/blazeface/frameworks/tfJs/variations/default/versions/1/model.json?tfjs-format=file&tfhub-redirect=true' (redirected from 'https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1/model.json?tfjs-format=file') from origin 'https://www.irozhlas.cz' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
js__dSJKmLCh4BpFgjlNf9Iac84ycwL8EphiuirtOMXoq8g__VtZTTyFHsrjhjaQGg5eL4fFzR50dH4-D8OnrCt4LnuM__MDVaI8Jh7c3M7XGde2fgB9UFepjZlSUMn2Nm5CbCILk.js:17 Uncaught (in promise) TypeError: Failed to fetch
    at e.<anonymous> (cropeditor.worker.0.js:2:29895)
    at e.t [as fetch] (cropeditor.worker.0.js:2:29798)
    at e.<anonymous> (cropeditor.worker.0.js:2:41927)
    at d (cropeditor.worker.0.js:2:402762)
    at Generator._invoke (cropeditor.worker.0.js:2:402550)
    at Generator.next (cropeditor.worker.0.js:2:403187)
    at r (cropeditor.worker.0.js:2:167011)
    at c (cropeditor.worker.0.js:2:167214)
    at cropeditor.worker.0.js:2:167273
    at new Promise (<anonymous>)


@KeijiBranshi
Collaborator

Hi all, we're looking into this issue. Thanks for your patience.

@sebassdominguez
Author

sebassdominguez commented Apr 9, 2024

For those hitting this problem, here is how I fixed it:

  1. Downloaded the model from https://www.kaggle.com/models/mediapipe/face-detection/frameworks/tfJs/variations/short/versions/1/
  2. Served the two files from my own domain and noted the path to the model.json file.
  3. Used that path as the detectorModelUrl key.
import { MediaPipeFaceDetectorTfjsModelConfig } from '@tensorflow-models/face-detection/dist/tfjs/types';
import * as faceDetection from '@tensorflow-models/face-detection';
import '@mediapipe/face_detection';


const { MediaPipeFaceDetector } = faceDetection.SupportedModels;
const detectorConfig: MediaPipeFaceDetectorTfjsModelConfig = {
    runtime: 'tfjs',
    maxFaces: 2,
    detectorModelUrl:
       'https://[yourdomain]/model.json',
};
faceModel.current = await faceDetection.createDetector(
    MediaPipeFaceDetector,
    detectorConfig,
);
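A note on why step 2 involves both files: TF.js resolves each weight path listed in model.json's weightsManifest relative to the URL of model.json itself, so model.json and its .bin shards must sit in the same directory. A minimal sketch of that resolution (the URL below is hypothetical):

```javascript
// Hypothetical self-hosted model.json URL.
const modelJsonUrl = 'https://example.com/models/face-detection/model.json';

// Mimics the relative-URL resolution TF.js performs for each weight shard
// listed in model.json's weightsManifest.
function shardUrl(modelUrl, shardPath) {
  return new URL(shardPath, modelUrl).toString();
}

console.log(shardUrl(modelJsonUrl, 'group1-shard1of2.bin'));
// → https://example.com/models/face-detection/group1-shard1of2.bin
```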

@pranjalieedee

pranjalieedee commented Apr 9, 2024

(quoted @sebassdominguez's fix above)

I tried doing the same.

const detectorConfig = {
  modelType: poseDetection.movenet.modelType.SINGLEPOSE_LIGHTNING,
  modelUrl:
};

I also tried writing a separate API that fetches the model from the Kaggle URL, adds CORS headers, and sends it to the front end.

router.get("/modelWrapper", async (req, res) => {
  try {
    console.log("********** poseDetector WRAPPER *************");
    const response = await fetch('https://www.kaggle.com/models/google/movenet/frameworks/tfJs/variations/singlepose-lightning/versions/4/model.json?tfjs-format=file&tfhub-redirect=true');
    
    console.log("********** RESPONSE WRAPPER *************");
    // Check if response is successful
    if (!response.ok) {
      throw new Error('Failed to fetch resource');
    }

    // Get response body as JSON
    const data = await response.json();

    // Add CORS headers
    res.setHeader('Access-Control-Allow-Origin', '*');
    console.log("********** HEADER WRAPPER *************");
    // Send response back to frontend
    res.json(data);
  } catch (error) {
    console.error('Error:', error.message);
    res.status(500).send('Internal server error');
  }
});

However, in both cases the weights (group1-shard2of2.bin, group2-shard2of2.bin) still need to be loaded from the same Kaggle URL, and I cannot find how to download them.

Any suggestions on fixing this would be helpful. The issue is happening in our live production environment.

@KeijiBranshi
Collaborator

Hi all, we rolled out a fix for the issue. You should hopefully start to see the model load from https://tfhub.dev URLs again soon. Due to varying layers of caching, propagation of the fix may be client dependent. So keep checking, but keep us posted if the URLs continue to fail.

Thanks again for your patience!

@KeijiBranshi
Collaborator

(quoted @pranjalieedee's comment above)

@pranjalieedee I don't know exactly what issue you're hitting without logs and/or a repro of your dev environment. But I do know that for self-hosting the model, you will need to host all the TF.js model files (model.json and *.bin) at the same path on your web app infrastructure. See the relevant docs on the expected TF.js file paths here:

The intermediate API code snippet you shared only fetches the model.json file:

const response = await fetch('https://www.kaggle.com/models/google/movenet/frameworks/tfJs/variations/singlepose-lightning/versions/4/model.json?tfjs-format=file&tfhub-redirect=true')

For TF.js to work on the frontend, you'll need to handle fetching all the other *.bin files that make up the model.
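One way to extend the intermediate-API approach to cover those files is to proxy every file under the model's base path, not just model.json. A sketch of the URL mapping (the base URL is taken from the fetch() call above; whether shard requests need the same query parameters is an assumption):

```javascript
// Base URL from the modelWrapper snippet above.
const KAGGLE_BASE =
  'https://www.kaggle.com/models/google/movenet/frameworks/tfJs/variations/singlepose-lightning/versions/4';

// Maps any requested model file (model.json or a weight shard) to its
// upstream Kaggle URL. Appending the same query string to shard requests
// is an assumption, not confirmed behavior.
function upstreamUrl(file) {
  return `${KAGGLE_BASE}/${file}?tfjs-format=file&tfhub-redirect=true`;
}

// A route like router.get('/modelWrapper/:file', ...) could then fetch
// upstreamUrl(req.params.file), set Access-Control-Allow-Origin on the
// response, and stream .bin files back as binary (response.arrayBuffer())
// rather than parsing them as JSON.
console.log(upstreamUrl('model.json'));
console.log(upstreamUrl('group1-shard1of2.bin'));
```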

@pranjalieedee

(quoted @KeijiBranshi's reply above)

Thank you. I downloaded the files from https://www.kaggle.com/models/google/movenet/frameworks/tfJs/variations/singlepose-lightning/versions/4 and am serving them locally.

We have been able to get the system up and running.

@KeijiBranshi
Collaborator

Thanks all again for sharing. Since the issue is resolved, I'm going to close these related threads out (#908, #909, #904).
