This issue was moved to a discussion.
backend with @tensorflow/tfjs-node? #25
Comments
Hey @AZOPCORP - so I've recently solved how to do this! (I think) I made a test, and for it to work it had to run 100% in Node. This test runs a classification on the logo without accessing a browser. I hope the setup code in this test works for you. I'm sure it could be better wrapped in the actual lib, though. If you do use this lib on a backend, please contribute back any improvements 👍
Thanks for your awesome job! Below is a working example with tfjs-node based on your test:

```js
const tf = require('@tensorflow/tfjs-node')
const load = require('./dist/index').load
const fs = require('fs')
const jpeg = require('jpeg-js')

// Fix for Jest: provide a global fetch implementation
const globalAny = global
globalAny.fetch = require('node-fetch')

const timeoutMS = 10000
const NUMBER_OF_CHANNELS = 3

const readImage = (path) => {
  const buf = fs.readFileSync(path)
  const pixels = jpeg.decode(buf, true)
  return pixels
}

// jpeg-js decodes to RGBA; drop the alpha channel to get RGB for the model
const imageByteArray = (image, numChannels) => {
  const pixels = image.data
  const numPixels = image.width * image.height
  const values = new Int32Array(numPixels * numChannels)
  for (let i = 0; i < numPixels; i++) {
    for (let channel = 0; channel < numChannels; ++channel) {
      values[i * numChannels + channel] = pixels[i * 4 + channel]
    }
  }
  return values
}

const imageToInput = (image, numChannels) => {
  const values = imageByteArray(image, numChannels)
  const outShape = [image.height, image.width, numChannels]
  const input = tf.tensor3d(values, outShape, 'int32')
  return input
}

// Leading semicolon is required: without it the IIFE's parentheses would be
// parsed as a call of imageToInput above
;(async () => {
  const model = await load('file://./model/') // moved model to the root of the folder
  const logo = readImage('./_art/nsfwjs_logo.jpg')
  const input = imageToInput(logo, NUMBER_OF_CHANNELS)
  console.time('predict')
  const predictions = await model.classify(input)
  console.timeEnd('predict')
  console.log(predictions)
})()
```
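The RGBA-to-RGB stripping done by `imageByteArray` above can be sanity-checked in isolation with plain Node, no TensorFlow required. A minimal sketch (the tiny 2×1 pixel data here is made up for illustration):

```javascript
// Standalone copy of the RGBA -> RGB channel-stripping loop from the
// snippet above, so it can be run without @tensorflow/tfjs-node.
const imageByteArray = (image, numChannels) => {
  const pixels = image.data
  const numPixels = image.width * image.height
  const values = new Int32Array(numPixels * numChannels)
  for (let i = 0; i < numPixels; i++) {
    for (let channel = 0; channel < numChannels; ++channel) {
      values[i * numChannels + channel] = pixels[i * 4 + channel]
    }
  }
  return values
}

// A fake 2x1 RGBA image: one red pixel, one green pixel (alpha = 255).
const fakeImage = {
  width: 2,
  height: 1,
  data: Uint8Array.from([255, 0, 0, 255, 0, 255, 0, 255]),
}

console.log(Array.from(imageByteArray(fakeImage, 3))) // alpha bytes dropped
```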
Very cool! Maybe we should put an example in the demo folder?
Something like a way to pass some options to configure the lib using tfjs-node and tfjs-node-gpu would be much appreciated, I think. BTW, I will make a backend demo as soon as I have the time.
AZOPCORP, I am getting this error: Request for file://.model/model.json failed due to error: TypeError: Only HTTP(S) protocols are supported.
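That "Only HTTP(S) protocols are supported" error typically appears when no `file://` IO handler is registered; requiring `@tensorflow/tfjs-node` before loading registers one. A simplified, hypothetical sketch of the dispatch (not the actual tfjs source):

```javascript
// Hypothetical sketch of how a model-URL scheme gets dispatched.
// tfjs-node registers a file-system handler; without it, only http(s) work.
function pickHandler(url, fileHandlerRegistered) {
  if (/^https?:\/\//.test(url)) return 'http'
  if (url.startsWith('file://') && fileHandlerRegistered) return 'file-system'
  throw new TypeError('Only HTTP(S) protocols are supported.')
}

console.log(pickHandler('file://./model/model.json', true)) // 'file-system'
```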
Closing because question moved to https://github.com/infinitered/nsfwjs/wiki/FAQ:-NSFW-JS |
For information, if you are still wondering how to run it on Node.js, this guy's code works: https://github.com/mishazawa/nums, when you do `const nsfwjs = require('nsfw/dist')`. He basically forked the code. It would be nice and not too much work to have this published to npmjs.org. Similarly, you can use:

```js
const toxicity = require('@tensorflow-models/toxicity')
const sentences = ['I love C++']
toxicity.load(0.9).then(model =>
  model.classify(sentences).then(predictions => ...)
)
```

What I would like to have, from a developer's point of view, is to be able to use it out of the box the same way:

```js
const image = ...
nsfwjs.load().then(model => // instead of nsfwjs.load('file://..../model/')
  model.classify(image).then(predictions => ...)
)
```
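The out-of-the-box `load()` suggested above could be sketched as a default parameter. A rough sketch, where `DEFAULT_MODEL_URL` and the injected `loader` callback are hypothetical names for illustration, not the actual nsfwjs API:

```javascript
// Hypothetical sketch: load() falls back to a built-in model location
// when no URL is given, so callers need not know where the model lives.
const DEFAULT_MODEL_URL = 'https://example.com/nsfwjs/model/' // placeholder host

function load(modelUrl = DEFAULT_MODEL_URL, loader = (url) => ({ url })) {
  // `loader` stands in for the real tf.loadLayersModel-style call.
  return loader(modelUrl)
}

console.log(load().url)                  // falls back to the default
console.log(load('file://./model/').url) // explicit path still works
```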
I wish he had contributed back with a Pull Request. @mycaule - would you be willing to take a shot? If not, this is something I could get around to at some point.
OK, I will try to do it this week; the workflow between npm and your build might be something I can't test, though. For the fetching of this line (Lines 21 to 22 in 0497856), it might save you some S3 costs:

```js
const mobilenet = require('@tensorflow-models/mobilenet')
mobilenet.load().then(model => model.classify(...).then(predictions => ...))
// Downloads a file from tfhub in the background:
// https://tfhub.dev/google/imagenet/mobilenet_v1_100_224/classification/1/model.json?tfjs-format=file
```

See #224
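Pointing a default model at a hosted location, as mobilenet does with tfhub, amounts to building a `model.json` URL with the `tfjs-format=file` query parameter. A small sketch of that URL construction:

```javascript
// Build a tfhub-style model.json URL with the tfjs-format query parameter,
// the same shape as the mobilenet URL quoted above.
function tfhubModelUrl(base) {
  const url = new URL(base)
  url.searchParams.set('tfjs-format', 'file')
  return url.toString()
}

console.log(tfhubModelUrl(
  'https://tfhub.dev/google/imagenet/mobilenet_v1_100_224/classification/1/model.json'
))
```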
Could it be possible to port the lib for backend use via tfjs-node and node-canvas?