Loading models outside the project #5
Nope, the format looks right, just a crappy error message (fully a tract-side issue). Did you give the size hints?
Not exactly sure what you mean by size hints, but my guess is no. In the JS, I am just giving the endpoint:
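Roughly along these lines (the URL below is a placeholder, and the call assumes tractjs's `load` entry point rather than the exact original snippet):

```js
import * as tractjs from "tractjs";

// Placeholder endpoint; the real URL points at the model served from the models directory.
const model = await tractjs.load("http://localhost:8000/models/model.pb");
```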
The server is a simple Rocket server serving static files.
The models directory contains the untarred model files.
OK, look at the first bullet point in the tractjs FAQ :) This looks like a TensorFlow model, so the input fact sizes should be [1, 224, 224, 3] (TensorFlow and ONNX use the NHWC and NCHW conventions by default, respectively).
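Concretely, passing those sizes in might look something like this (a sketch that assumes the `inputFacts` option of tractjs's `load`; the URL is a placeholder):

```js
import * as tractjs from "tractjs";

// Pin input 0 to NHWC [1, 224, 224, 3]; the frozen TensorFlow graph
// does not carry this shape information on its own.
const model = await tractjs.load("http://localhost:8000/models/model.pb", {
  inputFacts: {
    0: ["float32", [1, 224, 224, 3]],
  },
});
```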
OK I see! The model loads after I add the input facts. I find it curious that the examples don't need the input dimensions passed in:
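For instance, something along these lines works with no facts at all (a paraphrase with a made-up model URL, not the actual example code):

```js
import * as tractjs from "tractjs";

// Made-up ONNX model URL; no inputFacts are passed, yet it loads fine.
const model = await tractjs.load("https://example.com/squeezenet.onnx");
```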
How is it able to infer the input fact sizes in one case but not the other? 🤔
It's just that ONNX models usually embed their input size, and TensorFlow ones don't. I'm pretty sure the shape vector I gave you before is the one for MobileNet, but looking at the doc or sample code that should accompany the model should help.
@bminixhofer well, the full error message (when unfolding causes) actually looks like this when using the CLI. Not exactly perfect; not too sure why it's not nesting consistently, I will have a look. And in the case of a Source node, I could actually give a hint at what should be done ("help: consider providing an input fact with the input shape"). But it's only useful if the developer sees the full message. Is it possible that tractjs only gives the top-level context (Translating...) instead of the full error?
Oh good point, that's possible. And yes, that full message would definitely be enough information. I'll have a look.
OK, fixed it on tract main (@bminixhofer, using error-chain's ChainError::display_chain).
Hi Benjamin,
I'm a big fan of your tractjs project and am excited to use it to do ML on the web. For my particular use case, I want to load an arbitrary model which I don't want bundled into my code. In my experimenting, I have tried the following two approaches:
Load the model from the file system by doing something like:
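Something like this, with a made-up local path (the original snippet isn't reproduced here):

```js
import * as tractjs from "tractjs";

// Hypothetical local path; browsers refuse to fetch file:// resources.
const model = await tractjs.load("file:///home/me/models/model.pb");
```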
This fails with an error.
It makes sense that I wouldn't be able to use the fetch API to access local files; that would be a security issue for Chrome.
Serve the model as a static file from a web server and try to provide the endpoint to the constructor:
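Roughly like this (placeholder URL, and written with tractjs's `load` as a stand-in for the exact constructor call):

```js
import * as tractjs from "tractjs";

// Placeholder endpoint on the Rocket server that serves the models directory.
const model = await tractjs.load("http://localhost:8000/models/model.pb");
```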
If I do this, I get an error.
My guess is that the model is being chunked and Tract doesn't expect that. I will keep exploring this second approach but was curious if you had any ideas on this matter.
Edit: I checked the response header and see
Content-Length: 24508794
so it isn't a chunking issue. Maybe a MIME type or something...