Loading models outside the project #5

Closed
danielbank opened this issue Jul 20, 2020 · 9 comments

danielbank (Contributor) commented Jul 20, 2020

Hi Benjamin,

I'm a big fan of your tractjs project and am excited to use it to do ML on the web. For my particular use case, I want to load an arbitrary model that I don't want bundled into my code. In my experiments, I have tried the following two approaches:

1. Load the model from the file system, doing something like:

const model = new tractjs.Model('file:///Users/Bankster/Downloads/mobilenet_v2_1.4_224_frozen.pb');

This fails with an error:

Fetch API cannot load file:///Users/Bankster/Downloads/mobilenet_v2_1.4_224_frozen.pb. URL scheme "file" is not supported.

It makes sense that I can't use the fetch API to access local files; that would be a security issue in Chrome.

2. Serve the model as a static file from a web server and pass the endpoint to the constructor:

const model = new tractjs.Model('http://localhost:8080/models/mobilenet_v2_1.4_224_frozen.pb');

If I do this, I get the following error:

Error: TractError(
  Msg(
      "Translating node #0 \"input\" Source ToTypedTranslator",
  ),
  ...
)

My guess is that the model is being chunked and tract doesn't expect that. I will keep exploring this second approach, but I was curious whether you had any ideas on this matter.

Edit: I checked the response headers and see Content-Length: 24508794, so it isn't a chunking issue. Maybe a MIME type issue or something...

kali commented Jul 20, 2020

Nope, the format looks right; it's just a crappy error message (fully a tract-side issue). Did you give the size hints?

danielbank (Contributor, Author)

Not exactly sure what you mean by size hints, but my guess is no.

In the JS, I am just passing the endpoint:

const model = new tractjs.Model('http://localhost:8080/models/mobilenet_v2_1.4_224_frozen.pb');

The server is a simple Rocket server serving static files:

use rocket_contrib::serve::StaticFiles;
use rocket_cors::{CorsOptions, Error};

fn main() -> Result<(), Error> {
    // Permissive CORS so the browser app on another origin can fetch the model.
    let cors = CorsOptions::default().to_cors()?;
    rocket::ignite()
        .attach(cors)
        .mount(
            // Serve everything under <crate root>/models at /models.
            "/models",
            StaticFiles::from(concat!(env!("CARGO_MANIFEST_DIR"), "/models")),
        )
        .launch();
    Ok(())
}

The models directory contains the untarred mobilenet.pb from the tract tensorflow example code.

kali commented Jul 20, 2020

OK, look at the first bullet point in the tractjs FAQ :) This looks like a TensorFlow model, so the input fact sizes should be [1, 224, 224, 3] (TensorFlow and ONNX use the NHWC and NCHW conventions by default, respectively).
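
A minimal sketch of what passing those size hints might look like, assuming the inputFacts option described in the tractjs docs (a datatype string plus a shape, keyed by input index):

// Sketch: give tract an explicit input fact so it can type the Source node.
// The inputFacts option name is taken from the tractjs docs; the
// [1, 224, 224, 3] NHWC shape is the one suggested above.
const model = new tractjs.Model(
  "http://localhost:8080/models/mobilenet_v2_1.4_224_frozen.pb",
  {
    inputFacts: {
      0: ["float32", [1, 224, 224, 3]],
    },
  }
);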

danielbank (Contributor, Author) commented Jul 20, 2020

OK, I see! The model loads after I add the { optimize: false } option. I'll have to figure out what the input dimensions are, but I'm sure I can get them.
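
For reference, a minimal sketch of that workaround, assuming the optimize option from the tractjs docs (which presumably skips the typing/optimizing pass that requires the input shape):

// Sketch: load the unoptimized graph so tract does not need input facts yet.
// Option name from the tractjs docs; this appears to skip the
// into_optimized() step that fails above.
const model = new tractjs.Model(
  "http://localhost:8080/models/mobilenet_v2_1.4_224_frozen.pb",
  { optimize: false }
);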

I find it curious that the examples don't need the input dimensions passed in:

this.model = await new tractjs.Model(rURL("squeezenet1_1.onnx"));

How is it able to infer the input fact sizes in one case but not the other? 🤔

kali commented Jul 20, 2020

It's just that ONNX models usually embed their input size, while TensorFlow ones don't. I'm pretty sure the shape vector I gave you before is the one for MobileNet, but looking at the doc or sample code that should accompany the model will help.

bminixhofer (Owner)

Hi Daniel!

Glad you were able to resolve this. Thanks @kali for chiming in. I agree that a better error message would be nice, though. Can I assume that if into_optimized fails, it is always this kind of error, @kali? Then I'd be able to replace this with a custom error message that links to the FAQ.

kali commented Jul 20, 2020

@bminixhofer Well, the full error message (when unfolding causes) actually looks like this using the CLI. It's not exactly perfect, and I'm not too sure why it's not nesting consistently; I will have a look. And in the case of a Source node, I could actually give a hint at what should be done ("help: consider providing an input fact with the input shape").

But it's only useful if the developer sees the full message. Is it possible that tractjs only gives the top-level context (Translating...) instead of the full error?

[2020-07-20T07:49:34.388916051Z ERROR tract] TractError(Msg("Translating node #0 \"input\" Source ToTypedTranslator"), State { next_error: Some(TractError(Msg("Output type not determined"), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })), backtrace: InternalBacktrace { backtrace: None } })
[2020-07-20T07:49:34.389247709Z ERROR tract] Can not make a TypedFact out of ?x?x?x3xF32
Error: Can not make a TypedFact out of ?x?x?x3xF32

bminixhofer (Owner)

Oh good point, that's possible. And yes, that full message would definitely be enough information. I'll have a look.

kali commented Jul 20, 2020

OK, fixed it on tract main (@bminixhofer: using error-chain's ChainedError::display_chain).

[2020-07-20T08:30:44.108869119Z ERROR tract] Error: Translating node #0 "input" Source ToTypedTranslator
    Caused by: Source node without a determined fact. Help: provide explicit input facts to your model.
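
For the curious, a minimal sketch of that pattern: error-chain's ChainedError::display_chain prints an error followed by each of its causes. The error type and messages here are invented for illustration; tract's real TractError is richer.

use error_chain::{error_chain, ChainedError};

// Minimal error-chain type for illustration only.
error_chain! {}

fn main() {
    let cause: Error = "Source node without a determined fact".into();
    let err = Error::with_chain(cause, "Translating node #0 \"input\" Source ToTypedTranslator");
    // display_chain renders the top-level message plus a "Caused by:" line
    // per cause, which is the format shown in the log above.
    eprintln!("{}", err.display_chain());
}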
