
Different results for TFJS-Web vs Keras/TFJS-Node on same model and tensor #776

Closed

ixio opened this issue Oct 10, 2018 · 3 comments

ixio commented Oct 10, 2018

TensorFlow.js version

Node:

  • @tensorflow/tfjs 0.13.1
  • @tensorflow/tfjs-node 0.1.18

Python:

  • keras 2.2.4
  • tensorflowjs 0.6.2
  • tensorflow 1.11.0

Browser version

Firefox: 62.0.3 (64-bit)
Chromium: Version 69.0.3497.81 (Official Build) Built on Ubuntu, running on Ubuntu 18.04 (64-bit)

Describe the problem or feature request

I wanted to use a Keras-trained model in a JS application (Cordova/Ionic), so I converted it with the TensorFlow.js Python converter. However, my test in a webpage shows different results from my Python predictions.

I made a small POC with the model and the tensor (saved as JSON) and posted a Stack Overflow question, https://stackoverflow.com/questions/52683723/different-results-for-tensorflowjs-and-keras-on-same-model-and-tensor, which has received no answers.

However, I've now tried a small Node script and I get the same result as the Python script (a prediction of 0), while the browser version still predicts 1. I'm pretty sure that's a bug, right?

Code to reproduce the bug / link to feature request

Webpage version:
(I test it with python3 -m http.server)

// https://stackoverflow.com/a/18324384/2730032
function callAjax(url, callback){
  var xmlhttp;
  // compatible with IE7+, Firefox, Chrome, Opera, Safari
  xmlhttp = new XMLHttpRequest();
  xmlhttp.onreadystatechange = function(){
    if (xmlhttp.readyState == 4 && xmlhttp.status == 200){
      callback(xmlhttp.responseText);
    }
  }
  xmlhttp.open("GET", url, true);
  xmlhttp.send();
}

tf.loadModel('/model.json').then(model => {
  callAjax('/tensor.json', res => {
    const arr = JSON.parse(res);    // input values exported to JSON from Python
    const imgs = tf.tensor([arr]);  // wrap in an extra array to add the batch dimension
    model.predict(imgs).print();
  });
});

Result: Tensor [[1],]
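
For debugging, a quick sanity check of the parsed input could be added on the browser side (a sketch only, not part of the original POC; it just verifies that the browser builds the same tensor as the Node script):

// Sketch: confirm the browser-side input tensor matches the Node one.
callAjax('/tensor.json', res => {
  const arr = JSON.parse(res);
  const imgs = tf.tensor([arr]);        // same construction as above
  console.log(imgs.shape, imgs.dtype);  // should match what the Node script reports
  imgs.sum().print();                   // cheap fingerprint of the input values
});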

Node version:
(I test it with node test.js)

// Registers the native TensorFlow backend (npm install @tensorflow/tfjs-node)
require('@tensorflow/tfjs-node')
const tf = require('@tensorflow/tfjs') // npm install @tensorflow/tfjs

const test = async () => {
  const model = await tf.loadModel('file://model.json')

  const arr = require('./tensor.json') // same input values as the webpage test
  const imgs = tf.tensor([arr])        // wrap to add the batch dimension

  model.predict(imgs).print()
}

test()

Result: Tensor [[0],]

DATA: poc_bug.zip

bileschi self-assigned this Oct 12, 2018

caisq (Collaborator) commented Oct 13, 2018

@ixio Thanks for reporting this issue. @bileschi I spent a few minutes trying to reproduce this issue on my machine but wasn't able to.

Here are the environments in which I tested it:

  1. Keras 2.2.4 (backend: tensorflow CPU)
  2. tensorflowjs (v0.6.4) as in the test.py that @ixio provided
  3. tfjs-node (v0.1.18)
  4. tfjs in the browser (v0.13.2), running on Linux + Chrome

In all these environments, with the input that @ixio provided, I always get [[0.],], i.e., the same as the result from @ixio's node test.js.

When I run @ixio's test.js in Node, I also get [[0.],].

In addition, I tried the following artificial input data.

  • In python, it can be synthesized as
    xs = np.concatenate([
        np.zeros([1, 150, 150, 3]), np.ones([1, 150, 150, 3])],
        axis=0)
  • In js, it can be synthesized as
    const xs = tf.concat([tf.zeros([1, 150, 150, 3]), tf.ones([1, 150, 150, 3])]);
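
Using that synthetic xs, a minimal end-to-end check looks like this (a sketch only; it assumes the converted model is served as model.json, as in the snippets above):

// Sketch: run the synthetic two-image batch through the loaded model.
tf.loadModel('/model.json').then(model => {
  const xs = tf.concat([tf.zeros([1, 150, 150, 3]), tf.ones([1, 150, 150, 3])]);
  model.predict(xs).print();  // prints a [2, 1] output tensor
});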
    

The four environments above always give the same result (up to numeric precision):

[[2.4286556e-04]
 [3.5258019e-01]]  (Python)

Tensor
    [[0.0002429],
     [0.3525799]]  (JS)

So my gut feeling is that there is something peculiar with the way python3 -m http.server serves the binary weight values. @ixio Can you print out the weight values from the model after loading it from your HTTP server and verify whether they are the same as the values in Python? You can do this with something like: model.getWeights()[0].print()

In my browser (as well as in my Node.js environment), I get:

Tensor
    [[[[-0.1057436, -0.0803647, -0.0929867, ..., -0.1009188, -0.1444557, 0.0756956 ],
       [-0.0756691, 0.0646043 , -0.0005814, ..., 0.0867353 , -0.0186567, 0.0513486 ],
       [0.0802182 , 0.1376742 , 0.0923856 , ..., 0.0081945 , 0.0297689 , 0.0988621 ]],

      [[0.0176931 , 0.0583065 , 0.029618  , ..., 0.0688405 , -0.1353827, -0.0752142],
       [0.0669717 , -0.1103623, 0.0765599 , ..., -0.1193889, -0.0681074, -0.1075244],
       [-0.1077253, 0.0322383 , 0.0506719 , ..., -0.100226 , -0.0265293, -0.1188309]],

      [[-0.074231 , -0.0661954, 0.1063949 , ..., 0.0466323 , -0.1159561, -0.0600767],
       [0.0364849 , 0.1173892 , -0.0281694, ..., -0.0742352, -0.0999415, 0.0709506 ],
       [0.0169177 , -0.0314741, 0.0881025 , ..., -0.069786 , 0.0582412 , 0.0736627 ]]],


     [[[-0.0965689, 0.0174581 , 0.0294235 , ..., -0.0297376, -0.0689884, -0.0653607],
       [-0.1053417, 0.0048439 , -0.095906 , ..., -0.0284573, 0.0744983 , -0.1032496],
       [0.0947872 , -0.096565 , -0.0669127, ..., -0.086561 , -0.0780539, 0.1302573 ]],

      [[-0.0672446, -0.0330578, -0.0327127, ..., -0.0696173, -0.1180843, -0.0691746],
       [-0.1521278, -0.0740436, 0.105202  , ..., 0.0719692 , 0.0440298 , 0.1018965 ],
       [0.0534884 , -0.1493521, 0.0859028 , ..., -0.061255 , 0.1008514 , -0.1187244]],

      [[-0.0039256, 0.0439057 , 0.0327799 , ..., -0.029168 , -0.0184119, -0.0130499],
       [0.0702573 , 0.0632861 , 0.1153234 , ..., -0.078042 , 0.0148197 , -0.0308419],
       [0.1101074 , 0.0121241 , 0.0817019 , ..., 0.0726327 , -0.1145874, 0.0563667 ]]],


     [[[0.0437085 , 0.0599865 , -0.0660994, ..., 0.1362128 , -0.1257357, -0.141617 ],
       [-0.1166292, 0.0104953 , -0.1265896, ..., 0.1379085 , 0.1010464 , 0.0093707 ],
       [-0.0894183, 0.1219736 , -0.0419712, ..., 0.0824705 , -0.0997329, 0.0531115 ]],

      [[0.1284384 , 0.0460285 , 0.09834   , ..., 0.0652722 , 0.108386  , -0.0080737],
       [-0.0547349, -0.1120388, 0.0615913 , ..., -0.019493 , -0.0555399, 0.1175768 ],
       [0.0412559 , 0.0285659 , 0.0619678 , ..., 0.0442174 , -0.0042151, 0.0048014 ]],

      [[0.0474609 , -0.0664359, -0.0710074, ..., -0.0367686, -0.045939 , 0.0962715 ],
       [0.112375  , -0.1650521, -0.0288336, ..., -0.1454948, 0.1375744 , -0.0162317],
       [0.0000199 , -0.1542492, -0.0059271, ..., -0.061023 , 0.1336539 , 0.0908976 ]]]]


You might want to check the other values as well.
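
A more thorough variant of that check (a sketch only, not something run here; the per-tensor sum is just an illustrative fingerprint) would compare every weight tensor at full precision instead of relying on the truncated print:

// Sketch: fingerprint all weight tensors so browser and Node values can be diffed.
tf.loadModel('/model.json').then(async model => {
  const weights = model.getWeights();
  for (let i = 0; i < weights.length; i++) {
    const values = await weights[i].data();          // full-precision Float32Array
    const sum = values.reduce((a, b) => a + b, 0);   // simple checksum for comparison
    console.log(`weight ${i}: shape=[${weights[i].shape}] sum=${sum}`);
  }
});

Running the same loop in the browser and in Node and comparing the printed sums would rule out a difference in how the weights are loaded.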

ixio (Author) commented Oct 18, 2018

Thank you for your feedback, @caisq.

I've tried your artificial input data. In Node.js I get the same outputs as you, but in Firefox and Chromium I get Tensor [[0.9999834], [0.4889681]], which is once again not at all what we want.

I've tried model.getWeights()[0].print(); however, I get exactly the same values as you in both Firefox and Chromium. I also tried printing the weights for all 10 layers, but I get the same results in the browser as in Node.js.

Here is the output, but since it's the same as in Node I don't think it's going to help you:
chromium_print_layers.txt

I've tried using a JS server instead of the Python one: npm -g install serve; serve -s . I had to rename index.html to test and visit http://0.0.0.0:5000/test to get it to work. It fails without error in Firefox, but gives the same erroneous response as before in Chromium.

I've also made a POC with Docker (https://docs.docker.com/install/) which will hopefully help you figure this one out; I used the proposed artificial input data:
docker_poc.zip
You can run it this way (in the docker folder of the zip):

docker build -t testpoctfjs .
docker run --rm testpoctfjs

After about 30 seconds you should see the result Tensor [[0.6196299], [0.488968]], which is yet another set of wrong values.
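
Since the weights apparently match but the predictions differ, one way to narrow down where the forward pass diverges (a sketch only, nothing I have run in this POC; the layer index is illustrative and the tf.model call is assumed to be available in this tfjs version) would be to compare intermediate activations between the browser and Node:

// Sketch: build a truncated model that exposes one layer's activation, then
// fingerprint that activation in each environment to find where they diverge.
tf.loadModel('/model.json').then(async model => {
  const layerIndex = 1;  // illustrative; repeat for each layer of the model
  const partial = tf.model({
    inputs: model.inputs,
    outputs: model.layers[layerIndex].output
  });
  const xs = tf.concat([tf.zeros([1, 150, 150, 3]), tf.ones([1, 150, 150, 3])]);
  const activation = await partial.predict(xs).data();  // full-precision values
  console.log(`layer ${layerIndex} activation sum:`,
              activation.reduce((a, b) => a + b, 0));
});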

ixio (Author) commented Nov 4, 2018

I tested my Docker POC with tfjs@0.13.3 and it looks like it's working. Thank you, whoever is responsible, for fixing this.

ixio closed this as completed Nov 4, 2018