
Failed to load model from local (nodejs) #689

Open
azurenekowo opened this issue Nov 15, 2022 · 9 comments

Comments

@azurenekowo

azurenekowo commented Nov 15, 2022

Greetings. As the title says, I am clueless about how to load a local model. More details below.

Here is my codebase:

global.model = null
async function loadModel() {
    const loadfrom = 'file://' + __dirname.replace(/\\/g, '/') + '/mobilenet_v2_140_224/web_model/'
    console.log(loadfrom)
    const tf = require('@tensorflow/tfjs-node')
    const nsfw = require('nsfwjs')
    model = await nsfw.load(loadfrom, { type: 'graph' })
}
loadModel()

(Ignore the regex; it only changes "\" to "/" because I'm on a Windows machine. Removing it produces the same error.)

Console output:

file://C:/Users/user/Desktop/demo/mobilenet_v2_140_224/web_model/
C:\Users\user\Desktop\demo\node_modules\node-fetch\lib\index.js:1327
                throw new TypeError('Only absolute URLs are supported');
                      ^

TypeError: Only absolute URLs are supported
    at getNodeRequestOptions (C:\Users\user\Desktop\demo\node_modules\node-fetch\lib\index.js:1327:9)
    at C:\Users\user\Desktop\demo\node_modules\node-fetch\lib\index.js:1440:19
    at new Promise (<anonymous>)
    at fetch (C:\Users\user\Desktop\demo\node_modules\node-fetch\lib\index.js:1437:9)
    at HTTPRequest.PlatformNode.fetch (C:\Users\user\Desktop\demo\node_modules\@tensorflow\tfjs-core\dist\tf-core.node.js:7430:16)
    at HTTPRequest.<anonymous> (C:\Users\user\Desktop\demo\node_modules\@tensorflow\tfjs-core\dist\tf-core.node.js:8289:55)
    at step (C:\Users\user\Desktop\demo\node_modules\@tensorflow\tfjs-core\dist\tf-core.node.js:125:27
    at Object.next (C:\Users\user\Desktop\demo\node_modules\@tensorflow\tfjs-core\dist\tf-core.node.js:74:53)
    at C:\Users\user\Desktop\demo\node_modules\@tensorflow\tfjs-core\dist\tf-core.node.js:67:71
    at new Promise (<anonymous>)

The error occurs both with and without the file:// prefix.

Interestingly, if I put in a relative path in the same format as the one in the Node.js app demo in this repo's readme,
file:// ./mobilenet_v2_140_224/web_model/, it throws TypeError [ERR_INVALID_URL]: Invalid URL.

Paths I have tried:
file://./mobilenet_v2_140_224/web_model/ (removed space) => TypeError: Only HTTP(S) protocols are supported
file:///mobilenet_v2_140_224/web_model/ => TypeError: Only absolute URLs are supported
./mobilenet_v2_140_224/web_model/ => TypeError: Only absolute URLs are supported
/mobilenet_v2_140_224/web_model/ => TypeError: Only absolute URLs are supported

The issues I have found related to loading the model locally are all still open to this day: #512 #522 #652

Models I have used:
https://github.com/infinitered/nsfwjs/tree/master/example/nsfw_demo/public/quant_mid
https://github.com/GantMan/nsfw_model/releases/tag/1.1.0

Am I doing something wrong, or is the documentation so obscure that I couldn't find anything? Calling load() with no arguments loads the default model, which predicts images differently from the models on nsfwjs.com.

Any help would be greatly appreciated.

@cdanwards
Member

@scarletzumii sorry that you're experiencing this outstanding bug. We're taking some time to look into it!

@huhm

huhm commented Feb 16, 2023

Override the load function and pass a fetchFunc option through to tf.loadLayersModel/tf.loadGraphModel.

@sonofmagic

I'm hitting the same issue. I downloaded the latest nsfw_model from GantMan/nsfw_model's releases page, and it fails to load the layers model from either web_model or web_model_quantized.

Would you provide an example that loads a local model? Thanks.

@huhm

huhm commented Feb 27, 2023

You can override the load method like this:

import * as fs from 'fs';
import * as path from 'path';
import * as tf from '@tensorflow/tfjs-node';
import { NSFWJS } from 'nsfwjs';

const baseUrl = path.join(__dirname, '../xxx/quant_mid/');
const nsfwIns = new NSFWJS(baseUrl, {
  size: 224,
  type: 'graph',
});
overrideLoad(nsfwIns, baseUrl);

function overrideLoad(context: NSFWJS, modelBaseUrl: string) {
  context.load = async function nsfwnetOverrideLoad() {
    const { size, type } = this.options;
    // e.g. xxx/model.json
    const pathOrIOHandler = this.pathOrIOHandler;
    const loadOptions = {
      onProgress: (fraction: number) => {
        console.log(`ModelLoad onProgress: ${(fraction * 100).toFixed(1)}%`);
      },
      // Serve model files from disk instead of over HTTP.
      fetchFunc(fpath: string) {
        let curPath = fpath;
        // For Windows: the requested path may not exist as-is,
        // so fall back to resolving against the model base directory.
        if (!fs.existsSync(curPath)) {
          curPath = path.resolve(modelBaseUrl, './' + fpath);
        }
        console.log('ModelLoad file: ' + fpath, curPath);
        return import('node-fetch').then(({ Response: FetchResponse }) => {
          return new Promise((resolve, reject) => {
            fs.readFile(curPath, (err, data) => {
              if (err) {
                reject(err);
                return;
              }
              resolve(new FetchResponse(data));
            });
          });
        });
      },
    };
    if (type === 'graph') {
      this.model = await tf.loadGraphModel(pathOrIOHandler, loadOptions);
    } else {
      // This is a layers model.
      this.model = await tf.loadLayersModel(pathOrIOHandler, loadOptions);
      this.endpoints = this.model.layers.map((l) => l.name);
    }

    // Warm up the model.
    const result = tf.tidy(() =>
      this.model.predict(tf.zeros([1, size, size, 3])),
    ) as tf.Tensor;
    await result.data();
    result.dispose();
  };
}

@xxaier

xxaier commented Apr 27, 2023


import { join } from 'path';
import { readFileSync } from 'fs';
import { NSFWJS } from 'nsfwjs';
import tf from '@tensorflow/tfjs-node';
import ROOT from './ROOT';

await tf.enableProdMode();
await tf.ready();

const model_fp = 'file://' + join(ROOT, 'model/quant_mid/model.json');

// The first argument is unused because load() is replaced below.
const nsfw = new NSFWJS(0, {
  size: 224
});

nsfw.load = async function() {
  this.model = await tf.loadGraphModel(model_fp);
};

await nsfw.load();
console.log(nsfw.classify);

const img = readFileSync('/Users/z/art/pkg/bot/adult/out/9/0.png');
const bin = tf.node.decodeImage(img, 3);
console.log(await nsfw.classify(bin));
bin.dispose();

@JemiloII

None of these work without actually sending a request, and I don't want to send any requests. Is there a better way to load this without node-fetch, or better yet, could we use torch instead of tensorflow?
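For loading with no HTTP machinery at all, @tensorflow/tfjs-node bundles a file-system IOHandler that reads model.json and its weight shards directly from disk; a minimal sketch (the './web_model/model.json' path is a placeholder):

```javascript
// Sketch: load a graph model straight from disk via tfjs-node's
// file-system IOHandler -- no fetch, no node-fetch, no HTTP requests.
const tf = require('@tensorflow/tfjs-node');

async function loadLocalGraphModel(modelJsonPath) {
  // tf.io.fileSystem is registered by @tensorflow/tfjs-node.
  return tf.loadGraphModel(tf.io.fileSystem(modelJsonPath));
}

// Hypothetical usage:
// const model = await loadLocalGraphModel('./web_model/model.json');
```

The returned model could then be attached to an NSFWJS instance by overriding load, as in the earlier comments.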

@GantMan
Member

GantMan commented Jun 19, 2023

Is there a good torch for JS solution?

@miaoihan

So, is no one going to solve this problem?

@GantMan
Member

GantMan commented Jul 22, 2023

We're nearing a sprint to make NSFWJS work well on mobile.

If we have the time and resources we'll come back and look at this. It would be great if someone who figures it out could post their solution.
