
texture size [0x0] is invalid #377

Closed
L1391 opened this issue May 22, 2019 · 3 comments

@L1391 commented May 22, 2019

I am trying to use ml5 to create my own image classifier for different health states of coral. I want to use my own images to train and test the classifier. I am able to upload the images and train on the data, but I cannot classify an image.
I only get this error: texture size [0x0] is invalid

I am using Node.js on Chrome Version 74.0.3729.157, with ml5 0.2.5 and p5 0.8.0.

Here is my code; train() and predict() are called by button clicks:


let labels = ['bleach', 'dead', 'health'];
let mobilenet;
let classifier;
let pimg = new Image();
let loss;

function setup() {
    mobilenet = ml5.featureExtractor('MobileNet', modelReady);
    classifier = mobilenet.classification();

    labels.forEach(function(entry) {
        // add 17 training images for each label
        for (let i = 1; i <= 17; i++) {
            // use a fresh Image per sample: reusing a single Image object
            // overwrites its src on every iteration, so every addImage
            // call would end up pointing at the same (last) file
            let img = new Image();
            img.src = 'coral/' + entry + '/' + entry + i + '.jpg';
            classifier.addImage(img, entry, imageReady);
        }
    });

}

function imageReady() {
    console.log("image loaded");
}


function predict() {
    pimg.src = 'coral/health/health20.jpg';
    classifier.classify(pimg, gotResult);
}

function gotResult(result) {
    console.log(result);
}

function train() {
    classifier.train(function(lossValue) {
        if (lossValue) {
            loss = lossValue;
            console.log(loss);
        } else {
            console.log('Done Training! Final Loss: ' + loss);
        }
    });
}

function modelReady() {
    console.log("Model ready");
}
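
As an aside, the nested loop in setup() just enumerates 51 file paths (17 per label). Factored into a standalone helper (hypothetical — not part of ml5 or the original sketch), the path-building logic looks like this:

```javascript
// Hypothetical helper: build the list of training-image paths
// that setup() iterates over (labelN.jpg inside coral/label/).
function trainingPaths(labels, perLabel) {
  const paths = [];
  for (const label of labels) {
    for (let i = 1; i <= perLabel; i++) {
      paths.push('coral/' + label + '/' + label + i + '.jpg');
    }
  }
  return paths;
}

console.log(trainingPaths(['bleach', 'dead', 'health'], 17).length); // 51
```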





@joeyklee (Contributor) commented May 22, 2019

@L1391 - thanks so much for using the library and for raising this issue. Unfortunately, ml5 requires a browser to run, so its functionality won't work outside of one (e.g. in Node.js). ml5.js uses TensorFlow.js, which relies on the browser's GPU to run all the fancy calculations, so all of the functionality that ml5.js is built on assumes a browser GPU is available.

We hope to have ml5.js run in Node.js sometime in the near future (especially now that node support for tensorflow is a thing: https://www.tensorflow.org/js/guide/nodejs), but the current ml5 setup does not support Node.js.

Thanks + happy coding!

@joeyklee joeyklee closed this May 22, 2019

@L1391 (Author) commented May 22, 2019

Could I use the p5 online editor? Is that a “browser”?

@joeyklee (Contributor) commented May 22, 2019

Hi @L1391 - Yes, but there are some caveats:

Could I use the p5 online editor?

  • Yes, you can use the p5 web editor

Is that a “browser”?

  • Sort of. The p5 web editor runs in the browser but it is not a browser.

A number of the ml5 sketches don't currently work in the p5 web editor because of how the editor handles data files and how it handles network requests to external data (e.g. the big model files that let ml5.js run things like image detection). Some of those issues have popped up here (ml5js/ml5-examples#6)

There are lots of developments in the p5 web editor as well as in ml5 to make sure these environments all play nicely together, but the best thing to do is to try to run things locally if possible. Thanks!

@shiffman @wenqili - we should consider adding a FAQ to the website to document these really good & common questions :)
