
Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype #23

Closed
loretoparisi opened this issue Oct 2, 2020 · 11 comments
Assignees
Labels
enhancement New feature or request

Comments

@loretoparisi

Hello,
I'm trying to convert this TFJS GraphModel:

tfconv.loadGraphModel(
      'https://tfhub.dev/tensorflow/tfjs-model/toxicity/1/default/1',
      { fromTFHub: true })

The conversion works without any issues with the command tfjs_graph_converter --output_format tf_saved_model ./ ./saved/
But, when I try to load the saved model

tf.node.loadSavedModel(this.path);

I get the error

(node:39361) UnhandledPromiseRejectionWarning: Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype
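For context, the error is raised by tfjs-node's dtype mapping (the `mapTFDtypeToJSDtype` helper visible in the stack trace further down the thread): any tensor dtype with no TFJS equivalent is rejected at load time. A simplified sketch of that kind of check, not the actual tfjs-node source:

```javascript
// Simplified sketch of tfjs-node's dtype mapping: SavedModel dtypes with
// a TFJS equivalent are translated, everything else throws.
// (Illustrative only -- the real implementation lives in
// @tensorflow/tfjs-node/dist/saved_model.js; the supported set here is
// an assumption.)
const SUPPORTED_DTYPES = {
  DT_FLOAT: 'float32',
  DT_INT32: 'int32',
  DT_BOOL: 'bool',
  DT_STRING: 'string',
};

function mapTFDtypeToJSDtype(tfDtype) {
  const jsDtype = SUPPORTED_DTYPES[tfDtype];
  if (jsDtype === undefined) {
    throw new Error(
      `Unsupported tensor DataType: ${tfDtype}, ` +
      'try to modify the model in python to convert the datatype');
  }
  return jsDtype;
}

console.log(mapTFDtypeToJSDtype('DT_FLOAT')); // float32
// mapTFDtypeToJSDtype('DT_INT64') throws -- exactly the rejection above.
```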
@loretoparisi
Author

Okay, this should be related to PR tensorflow/tfjs#4008

@patlevin
Owner

patlevin commented Oct 2, 2020

I'm not sure I understand what you are trying to do.
The model in question is a TFJS model, so there is no need to convert it into a saved model; you can just load it directly into node. The model works OOTB with nodejs and in the browser.

@loretoparisi
Author

loretoparisi commented Oct 2, 2020 via email

@patlevin
Owner

patlevin commented Oct 2, 2020

Thanks for the explanation! Since I use TF with Python and C++ only, I wasn't aware of this limitation.
The converter currently doesn't guarantee to output models that can be loaded by TFJS.
I could add a flag to keep the data types compatible, though, if that helps.

@patlevin patlevin added the enhancement New feature or request label Oct 2, 2020
@patlevin patlevin self-assigned this Oct 2, 2020
@patlevin
Owner

patlevin commented Oct 3, 2020

@loretoparisi I have bad news on this one. Unfortunately you'll have to wait for the TFJS team to release an update.
Keeping the types compatible would require rewriting the entire graph, identifying and optimising away redundant nodes (e.g. type cast operations), and risking differences in behaviour (though I guess overflow/underflow issues are unlikely, but still).

The reason is that if I change the weight node types, all operations that use them as inputs will need to have their input- and output types changed as well. The latter is the actual problem since the type change now cascades to all nodes connected to that node...

I still might give it a try just for exercise, but don't count on it.
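The cascade described above can be pictured with a toy graph: retyping one weight node forces a retype of every downstream node that shares its dtype, and the change only stops at an explicit cast. The node and edge structure below is purely hypothetical, not the converter's internal format:

```javascript
// Toy illustration of why retyping a weight node cascades: every node
// downstream of it must have its input/output dtypes rewritten too.
// (Hypothetical graph structure, for illustration only.)
const graph = {
  // node name -> { dtype, inputs }
  embedding_weights: { dtype: 'DT_INT64', inputs: [] },
  gather:            { dtype: 'DT_INT64', inputs: ['embedding_weights'] },
  cast:              { dtype: 'DT_FLOAT', inputs: ['gather'] },
  dense:             { dtype: 'DT_FLOAT', inputs: ['cast'] },
};

// Retype `start` plus every transitive consumer that still carries the
// old dtype -- the cascade stops at nodes of a different type (casts).
function retype(graph, start, from, to) {
  const changed = new Set();
  const queue = [start];
  while (queue.length > 0) {
    const name = queue.shift();
    if (changed.has(name) || graph[name].dtype !== from) continue;
    graph[name].dtype = to;
    changed.add(name);
    for (const [other, node] of Object.entries(graph)) {
      if (node.inputs.includes(name)) queue.push(other);
    }
  }
  return changed;
}

const affected = retype(graph, 'embedding_weights', 'DT_INT64', 'DT_INT32');
console.log([...affected]); // [ 'embedding_weights', 'gather' ]
```

Even in this four-node toy, one weight change touches two nodes; in a real graph the reachable set can be most of the model, which is why the rewrite is non-trivial.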

@loretoparisi
Author

Thanks a lot Patrick, it makes sense. Hopefully the DT_INT64 support will be ready in a month or so!

@patlevin
Owner

patlevin commented Oct 5, 2020

@loretoparisi Good news! I managed to solve the problem by converting incompatible inputs in the graph.
I tested the result in tf.node and the converted model loaded just fine.

The new version will be available on PyPi in just a few moments.
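The idea of converting incompatible inputs can be pictured roughly as follows: int64 weight buffers are narrowed to int32 before tfjs-node ever sees them. This is a sketch of the concept, not tfjs_graph_converter's actual code; the range check stands in for the overflow risk mentioned earlier in the thread:

```javascript
// Sketch of narrowing an int64 weight buffer to int32 so tfjs-node can
// load it. Values outside the int32 range would otherwise silently wrap,
// which is the overflow risk mentioned above -- here we throw instead.
// (Illustrative only, not tfjs_graph_converter's implementation.)
function int64ToInt32(values) {
  const out = new Int32Array(values.length);
  values.forEach((v, i) => {
    if (v > 2147483647n || v < -2147483648n) {
      throw new RangeError(`value ${v} does not fit into int32`);
    }
    out[i] = Number(v);
  });
  return out;
}

const weights = new BigInt64Array([0n, 42n, -7n]);
console.log(int64ToInt32(weights)); // Int32Array(3) [ 0, 42, -7 ]
```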

@loretoparisi
Author

loretoparisi commented Oct 5, 2020

@patlevin wow, that's amazing!!! 💯 🥇 I will test it as well and get back to you!

@loretoparisi
Author

loretoparisi commented Oct 5, 2020

@patlevin Just to be sure we are working on the same model.

mbploreto:toxicity_model loretoparisi$ tfjs_graph_converter --output_format tf_saved_model ./ ./saved/
TensorFlow.js Graph Model Converter

Graph model:    ./
Output:         ./saved/
Target format:  tf_saved_model

Converting.... Done.
Conversion took 1.775s
mbploreto:toxicity_model loretoparisi$ tfjs_graph_converter --version

tfjs_graph_converter 1.4.0

Dependency versions:
    tensorflow 2.3.1
    tensorflowjs 2.4.0

I have downloaded the model from TF Hub and then ran the conversion.

This is my JavaScript test:

const tfjsnode = require('@tensorflow/tfjs-node');
const tfconv = require("@tensorflow/tfjs-converter");

var loadGraphModel = function (url) {
  return new Promise(function (resolve, reject) {
    tfconv.loadGraphModel(url,
      { fromTFHub: true })
      .then(res => {
        console.log("loadGraphModel");
        resolve(res);
      })
      .catch(err => reject(err));
  });
}
var loadSavedModel = function (path) {
  return new Promise(function (resolve, reject) {
    tfjsnode.node.loadSavedModel(path)
      .then(res => {
        console.log("loadSavedModel");
        resolve(res);
      })
      .catch(err => reject(err));
  });
}
loadGraphModel('https://tfhub.dev/tensorflow/tfjs-model/toxicity/1/default/1')
  .catch(err => console.error("loadGraphModel", err));
loadSavedModel('/Users/loretoparisi/webservice/toxicity_model/saved')
  .catch(err => console.error("loadSavedModel", err));

This is the current output:

$ node load.js
Platform node has already been set. Overwriting the platform with [object Object].
Platform node has already been set. Overwriting the platform with [object Object].
node-pre-gyp info This Node instance does not support builds for N-API version 6
node-pre-gyp info This Node instance does not support builds for N-API version 6
2020-10-05 12:16:46.423579: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-10-05 12:16:46.469215: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1094005d0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-10-05 12:16:46.469265: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version

loadSavedModel Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype
    at mapTFDtypeToJSDtype (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-
models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:469:19)
    at /Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:161:57
    at step (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:48:23)
    at Object.next (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:29:53)
    at fulfilled (/Users/loretoparisi/Documents/MyProjects/AI/tfjs-models/toxicity/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:20:58)

loadGraphModel OK

So it seems the problem is still there

loadSavedModel Error: Unsupported tensor DataType: DT_INT64, try to modify the model in python to convert the datatype

But maybe it's my fault at this point...

@patlevin
Owner

patlevin commented Oct 5, 2020

@loretoparisi You need to use the compatibility flag:

tfjs_graph_converter --output_format tf_saved_model --compat_mode ./ ./saved/

The compatibility mode is optional; the default behaviour is to keep all types as-is.

@loretoparisi
Author

loretoparisi commented Oct 5, 2020

@patlevin ah right!!

tfjs_graph_converter --output_format tf_saved_model --compat_mode ./ ./saved/
TensorFlow.js Graph Model Converter

Graph model:    ./
Output:         ./saved/
Target format:  tf_saved_model

Converting.... Done.
Conversion took 1.667s

...
2020-10-05 12:28:55.915996: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /Users/loretoparisi/webservice/toxicity_model/saved
2020-10-05 12:28:55.943074: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-10-05 12:28:56.009469: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2020-10-05 12:28:56.010138: I tensorflow/cc/saved_model/loader.cc:212] The specified SavedModel has no variables; no checkpoints were restored. File does not exist: /Users/loretoparisi/webservice/toxicity_model/saved/variables/variables.index
2020-10-05 12:28:56.010284: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 94285 microseconds.
loadSavedModel OK

Perfect it works! Thank you! 🥇
