
tf.loadModel not working in ionic #272

Closed
hyun-yang opened this issue May 7, 2018 · 20 comments

Labels
comp:layers P2 type:feature New feature or request

Comments

hyun-yang commented May 7, 2018

TensorFlow.js version

0.10.0

Browser version

cli packages: (C:\Users\Administrator\AppData\Roaming\npm\node_modules)

@ionic/cli-utils  : 1.19.2
ionic (Ionic CLI) : 3.20.0

global packages:

cordova (Cordova CLI) : 7.1.0

local packages:

@ionic/app-scripts : 3.1.8
Cordova Platforms  : android 6.3.0
Ionic Framework    : ionic-angular 3.9.2

System:

Android SDK Tools : 25.2.5
Node              : v8.9.1
npm               : 5.6.0
OS                : Windows 10

Describe the problem or feature request

tf.loadModel is not working: it fails to load the model from a local folder (i.e. assets/model). However, the web version works when it runs on http://localhost:8100/ionic-lab.

Code to reproduce the bug / link to feature request

import {Component} from '@angular/core';
import {HttpClient} from '@angular/common/http';
import {IonicPage, AlertController} from 'ionic-angular';
import * as tf from "@tensorflow/tfjs";
// Project-specific provider; adjust the import path to your project.
import {LoadingServiceProvider} from '../../providers/loading-service/loading-service';

@IonicPage()
@Component({
  selector: 'page-tfpretrainedversion',
  templateUrl: 'tfpretrainedversion.html',
})
export class TfpretrainedversionPage {

  kerasTrainedModel: tf.Model;
  KERAS_MODEL_JSON = 'assets/model/model.json';

  constructor(private httpClient: HttpClient,
              private alertCtrl: AlertController,
              private loadingService: LoadingServiceProvider) {
    this.loadPretrainedModel();
  }

  loadPretrainedModel() {

    tf.loadModel(this.KERAS_MODEL_JSON)
      .then((result) => {
        this.kerasTrainedModel = result;
      })
      .catch((error) => {
        // Show the load error in an alert so it is visible on device.
        let prompt = this.alertCtrl.create({
          title: 'Error',
          subTitle: error,
          buttons: ['OK']
        });
        prompt.present();
      });
  }
}

Here is the error message:
[screenshot: screenshot_2018-05-08-05-43-54]

And here is the data structure:
[screenshot: pretrainedmodel]

Contributor

tbrnd commented May 7, 2018

Hi!

tf.loadModel is not working from a local folder (i.e. assets); however, the web version works when it runs on http://localhost:8100/ionic-lab

You should check #257, I think it will help you!

Author

hyun-yang commented May 7, 2018

@timotheebernard
Thanks for your response.
Btw, let me clarify this feature request.

It works when I run it on my local server http://localhost:8100/ionic-lab.

As you mentioned,

#257

when running your code you should see the following error:
Fetch API cannot load [...]/model.json. URL scheme must be "http" or "https" for CORS request.
You need to serve your model via a http server that allows CORS request for loading it.

What I really want to do is load this model from a local folder (in this case, assets/model/) without using an HTTP server.

In Ionic (I think many hybrid app platforms have the same build process), when a developer builds a native app for Android, iOS, or Windows, they might want to load the model from a local folder that is already packaged inside the output file (apk, ipa) rather than via an HTTP server.

It'd be great if we had an API like tf.loadModelFromLocal.

Thanks.

Contributor

tafsiri commented May 9, 2018

@hyun-yang to help us understand this use case, could you describe how you would generally load a file from a local folder in ionic?

tafsiri added the type:feature (New feature or request) and comp:layers labels on May 9, 2018
@hyun-yang
Author

@tafsiri
Thanks for your concern. I'll let you know when I've uploaded a test project for this.

Author

hyun-yang commented May 12, 2018

@tafsiri
Just uploaded demo project https://github.com/hyun-yang/tfjsionicdemo

@musfandi

Hi, is there any progress on this problem?

@JulianDietz

Is there any workaround?


gabrielglbh commented Jan 17, 2019

As far as I can read in this thread and in the referenced ones, loading the model from a local folder in Ionic (as @hyun-yang mentioned) is still not supported when launching on a device, but it works when running on localhost.

Is there any progress on this issue, or, if it has been solved, could you explain how to do it?

Contributor

caisq commented Feb 12, 2019

@gabrielglbh loadModel in the Node.js version of TF.js supports file:// URLs. Does that work for Ionic by any chance?
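For reference, a minimal sketch of what that looks like in plain Node.js (assuming @tensorflow/tfjs-node is installed and the model files are readable on disk; whether the same file:// handler is usable inside an Ionic webview is exactly the open question here):

const tf = require('@tensorflow/tfjs');
// Importing tfjs-node registers the native backend and the file:// IO handlers.
require('@tensorflow/tfjs-node');

async function loadFromDisk() {
  // Illustrative path; point this at your own model directory.
  // In tfjs 1.x the call is tf.loadLayersModel; in 0.x it is tf.loadModel.
  const model = await tf.loadModel('file://assets/model/model.json');
  return model;
}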


gabrielglbh commented Feb 13, 2019

@caisq When installing tfjs-node v0.3.0 in the project I get this error: Cannot find module './tfjs_binding'. So I cannot test whether it works or not in Ionic 3 with the Node version of tfjs. Once I solve this, I will report back here whether it works.


gabrielglbh commented Feb 21, 2019

Following up on the above comment, as an update: I've tried to import tfjs-node 0.3.0 into my Ionic 3 app without success:

import * as tf from '@tensorflow/tfjs';
require('@tensorflow/tfjs-node');
const tfModel = tf.loadModel('/assets/model.json');

The code above gives me the following error: 'The "original" argument must be of type function'. It appears to be caused by the require() line.

I've also tried importing only the tfjs-node version:

import * as tf from '@tensorflow/tfjs-node';
const tfModel = tf.loadModel('/assets/model.json');

But then the following error appears: Cannot find module './tfjs_binding'.

So am I doing something wrong when using the tfjs-node version? Or does tfjs still not support loading models from local files on device in Ionic?

Contributor

caisq commented Feb 21, 2019 via email


gabrielglbh commented Mar 17, 2019

Well, after trying for two weeks with different versions of tfjs-node and tfjs, loading a model from the local assets folder is not compatible with Ionic & Angular when deployed on a device.

Latest versions I've tried:

  • tfjs -- 1.0.0
  • tfjs-node -- 1.0.1

The error I am getting on my device when trying to load the model is:
Based on the provided shape, [3,3,32,32], the tensor should have 9216 values but has 1119. I found out that this error comes from the weightManifest files not being loaded correctly from the local folder: the model topology loads, but for reasons I am not aware of, the weights do not. Note: the shards are located in the same folder as the model.


b-lack commented Mar 22, 2019

TensorFlow.js loads models via fetch(), and fetch() does not support loading local files: https://fetch.spec.whatwg.org/

To make this work, I used the following workaround in a Cordova project:

Import a polyfill (https://github.com/github/fetch) and replace the global fetch:

window.fetch = fetchPolyfill;

Now, it's possible to load local files (file:///) like:

const modelUrl = './model.json'

const model = await tf.loadGraphModel(modelUrl);


gabrielglbh commented Mar 23, 2019

@b-lack Thank you for the workaround. I have tried it, but I still cannot get it right. Looking at the links and documentation provided, I have done the following:

import * as tf from '@tensorflow/tfjs';
import { fetch as fetchPolyfill } from 'whatwg-fetch';

constructor() {
    // Replace the global fetch with the polyfill before loading the model.
    window.fetch = fetchPolyfill;
    this.loadModel();
}

async loadModel() {
    const modelJSON = '/assets/model.json';
    const model = await tf.loadLayersModel(modelJSON);
}

When deploying on my Android device using Cordova, I get the same error: Based on the provided shape, [3,3,32,32], the tensor should have 9216 values but has 1119.

I have also tried loading the model with tf.loadGraphModel(), but then I get the following error: Uncaught TypeError: Cannot read property 'producer' of undefined. I looked it up in issue #1432, and it seems the correct way to load a Keras model (like mine) with tfjs is tf.loadLayersModel().

So I am wondering whether the import of the polyfill is wrong, as the weight shards of the model still don't load correctly even with the global fetch overwritten.


b-lack commented Mar 26, 2019

@gabrielglbh I don't know if your error message is related to the import. I have only tried this polyfill with graph models.

For layers models I would recommend loading the model once from a server (https://...) and saving it locally in localStorage or IndexedDB: https://www.tensorflow.org/js/guide/save_load

const model = await tf.loadLayersModel('https://foo.bar/tfjs_artifacts/model.json');
const saveResult = await model.save('localstorage://my-model-1');

From then on, you can load it from localStorage / IndexedDB:

const model = await tf.loadLayersModel('localstorage://my-model-1');
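The same pattern works with the indexeddb:// scheme from the save/load guide linked above; IndexedDB generally allows larger models than localStorage, so it may be the safer choice for bigger weights. A sketch:

// Save once to IndexedDB instead of localStorage.
const saveResult = await model.save('indexeddb://my-model-1');

// On later launches, load straight from IndexedDB.
const cachedModel = await tf.loadLayersModel('indexeddb://my-model-1');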


lxieyang commented Jun 5, 2019

Any follow-up on this issue? @gabrielglbh
Thanks!

@msektrier

(Quoting b-lack's fetch-polyfill workaround above.)

I consider this very important.
However, I want to add some information here.

For running on devices, you need the version from the releases section: https://github.com/github/fetch/releases

Once integrated with

<script type="text/javascript" src="cordova.js"></script>
<script type="text/javascript" src="js/fetch.umd.js"></script>

you can overwrite window.fetch directly when the app initializes with:

initialize: function() {
        window.fetch = WHATWGFetch.fetch;
        document.addEventListener('deviceready', this.onDeviceReady.bind(this), false);
    }

and load the model via

async function loadModel() {
	try {
	    tfModelCache = await tf.loadGraphModel('model.json');
	    return tfModelCache
	} catch (err) {
	  console.log(err)
	}
}

Now I am having the problem that the paths to the shards in model.json no longer resolve correctly. However, at least the fetch method is working.
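Depending on the tfjs version, the load options may help with the shard paths: tf.loadGraphModel accepts a fetchFunc and, in recent releases, a weightPathPrefix, so the weight requests can be pointed at the packaged asset folder without editing model.json. A sketch with an illustrative prefix:

async function loadModelWithPrefix() {
  try {
    const model = await tf.loadGraphModel('model.json', {
      // Prefix prepended to every weight path from the manifest; adjust to your layout.
      weightPathPrefix: 'assets/model/',
      // Reuse the polyfilled fetch for the weight requests as well.
      fetchFunc: WHATWGFetch.fetch
    });
    return model;
  } catch (err) {
    console.log(err);
  }
}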

nsthorat pushed a commit that referenced this issue Aug 19, 2019
Contributor

rthadur commented Mar 6, 2020

Automatically closing due to lack of recent activity. Please update the issue when new information becomes available, and we will reopen the issue. Thanks!

@rthadur rthadur closed this as completed Mar 6, 2020
@aliyoung

1. Rename the weight file group1-shard1of1 to group1-shard1of1.bin.

2. Define a custom fetch function:

function customFetchFunc(url, o) {
  if (url === 'model/group1-shard1of1') {
    return self.fetch(`${url}.bin`, o);
  }
  return self.fetch(url, o);
}

3. Set fetchFunc:

const nsfwModel = await tf.loadLayersModel('models/mobilenet_v2/model.json', { fetchFunc: customFetchFunc });
