Feature request: use Hub modules with Eager execution enabled #124

Closed
nathanin opened this issue Aug 1, 2018 · 23 comments

nathanin commented Aug 1, 2018

Hi, I guess this is best classified as a support request. I'm trying to use a Hub ResNet image model in a transfer learning application. Most of my project takes advantage of TensorFlow's eager execution mode. When I add the image module, I get the error below at the line that should download and import the module (it is inside a tfe.Network.__init__() method).

The main message is pretty straightforward: no graph exists in eager mode, so the Hub module graph cannot be imported. Is there a way around this, or will I have to dump the variables to a .npy file and use them to initialize eager tensors?

Traceback (most recent call last):
  File "run.py", line 170, in <module>
    main(train_list, val_list, test_list, Model, loss_function, arch_dir, fold_num)
  File "run.py", line 75, in main
    model = Model()
  File "/home/.../script.py", line 32, in __init__
    self.resnet = hub.Module(MODULE_URL, trainable=True, tags={'train'})
  File "/home/ing/envs/tensorflow/local/lib/python2.7/site-packages/tensorflow_hub/module.py", line 126, in __init__
    tags=self._tags)
  File "/home/ing/envs/tensorflow/local/lib/python2.7/site-packages/tensorflow_hub/native_module.py", line 282, in _create_impl
    name=name)
  File "/home/ing/envs/tensorflow/local/lib/python2.7/site-packages/tensorflow_hub/native_module.py", line 333, in __init__
    variable_tensor_map, self._state_map = self._create_state_graph(name)
  File "/home/ing/envs/tensorflow/local/lib/python2.7/site-packages/tensorflow_hub/native_module.py", line 383, in _create_state_graph
    restore_collections_predicate=(lambda key: key in import_collections))
  File "/home/ing/envs/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/training/saver.py", line 1943, in import_meta_graph
    raise RuntimeError("Exporting/importing meta graphs is not supported when "
RuntimeError: Exporting/importing meta graphs is not supported when eager execution is enabled. No graph exists when eager execution is enabled.

Some information about my environment:

$ pip freeze | grep tensorflow
tensorflow==1.8.0
tensorflow-gpu==1.7.0
tensorflow-hub==0.1.0
tensorflow-tensorboard==1.5.0

$ python --version
Python 2.7.12

arnoegw commented Aug 3, 2018

Hi Nathan, using Hub modules from Eager mode would be great, but regrettably it is not possible at this point in time. I'll keep this issue open as a feature request.

@arnoegw arnoegw changed the title Impossible to use Hub modules with Eager execution enabled? Feature request: use Hub modules with Eager execution enabled Aug 3, 2018
@andresusanopinto andresusanopinto added the type:enhancement New feature or request label Aug 6, 2018
@mynameisguy

+1 for the request

wshuyi commented Sep 27, 2018

+1 for the request

huan commented Oct 5, 2018

+1 for the request

@pprasertsak

+1 for the request

gokul-uf commented Oct 19, 2018

@arnoegw considering that Eager will be the default execution mode from TF 2 onwards, wouldn't this feature request be more than just a "nice to have"? I'm curious whether there is any timeline for eager support.

arnoegw commented Oct 19, 2018

@gokul-uf: Agreed. We are working on TF2 support, including Eager mode, in time for the TF2.0 release (not preview).

@gokul-uf

Thanks for the update @arnoegw. For those who can't wait until TF2, I believe tfe.py_func might be a nice intermediate solution. Hacky, yes, but it works. See here for an example.

@Selimonder

+1 for this

@Harshini-Gadige Harshini-Gadige added hub For all issues related to tf hub library and tf hub tutorials or examples posted by hub team type:feature and removed type:enhancement New feature or request labels Mar 14, 2019
@leemengtw

Looking forward to this.

I'm an early TF 2.0 adopter but can't figure out how to use TF Hub with the TF 2.0 preview version.

arnoegw commented Mar 25, 2019

@leemengtaiwan: For guiding examples, please see
https://colab.research.google.com/github/tensorflow/hub/blob/master/examples/colab/tf2_text_classification.ipynb
https://colab.research.google.com/github/tensorflow/hub/blob/master/examples/colab/tf2_image_retraining.ipynb
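
For reference, here is a minimal sketch of the hub.KerasLayer pattern those colabs follow (the module handle, layer sizes, and loss below are illustrative choices, not the only ones that work):

import tensorflow as tf
import tensorflow_hub as hub

# hub.KerasLayer wraps a TF Hub model as a Keras layer that works with
# eager execution in TF 2.x. The handle below is just an example text
# embedding; substitute whichever module you need.
embedding = hub.KerasLayer("https://tfhub.dev/google/tf2-preview/nnlm-en-dim50/1",
                           input_shape=[], dtype=tf.string, trainable=True)

model = tf.keras.Sequential([
    embedding,
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=["accuracy"])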

@rmothukuru

Closing this feature request, as Hub modules can be used with eager execution enabled from the TensorFlow 2.0 Alpha version onwards. Please feel free to reopen this issue if your requirement is not implemented.

@Aashish-1008

+1 for this

amitness commented Jul 6, 2019

+1

@shishirmane

I checked with the recently released tensorflow 2.0.0 (stable), and the issue still persists.

I tried accessing the ELMo vectors using the following tensorflow-hub line:
elmo_model = hub.Module("https://tfhub.dev/google/elmo/2", trainable=True)

Exception:
RuntimeError: Exporting/importing meta graphs is not supported when eager execution is enabled. No graph exists when eager execution is enabled.

arnoegw commented Oct 2, 2019

Please use

elmo_model = hub.load("https://tfhub.dev/google/elmo/2")
out = elmo_model.signatures["default"](sentences)  # sentences: a batch of strings as a tf.string tensor

...optionally with a tf.GradientTape() around the call.

In TF2, hub.load() is the right API to use for both TF1-style Hub modules and TF2-style SavedModels. All it does is fetch the model contents into TFHUB_CACHE_DIR and then call tf.saved_model.load() from there.

More detailed documentation on TF Hub will be forthcoming. For TF2-style SavedModels in general, please see https://www.tensorflow.org/guide/saved_model
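
For completeness, a minimal end-to-end sketch of the above (the cache directory and example sentence are placeholders):

import os
import tensorflow as tf
import tensorflow_hub as hub

# Optional: choose where hub.load() caches the downloaded model.
os.environ["TFHUB_CACHE_DIR"] = "/tmp/tfhub_cache"

elmo_model = hub.load("https://tfhub.dev/google/elmo/2")

# The default ELMo signature takes a batch of sentences as a tf.string tensor
# and returns a dict of outputs; "default" is the pooled sentence embedding.
sentences = tf.constant(["the quick brown fox jumps over the lazy dog"])
out = elmo_model.signatures["default"](sentences)
print(out["default"].shape)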

federicoruggeri commented Oct 7, 2019

Hello,
I've just tried the following:

import tensorflow as tf
import tensorflow_hub as hub

elmo = hub.load('https://tfhub.dev/google/elmo/2')
test_input = tf.constant(['the cat is on the table'], dtype=tf.string)

with tf.GradientTape() as tape:
    result = elmo.signatures['default'](test_input)

And it throws AttributeError: 'NoneType' object has no attribute 'outer_context' when calling the loaded ELMo model. On the other hand, if I remove the tf.GradientTape() wrapper, the code works. Am I missing something?

elmo = hub.load('https://tfhub.dev/google/elmo/2')
test_input = tf.constant(['the cat is on the table'], dtype=tf.string)
result = elmo.signatures['default'](test_input)

Tensorflow config:
tensorflow-estimator==2.0.0
tensorflow-gpu==2.0.0
tensorflow-hub==0.6.0

Python config:
Python 3.5.3

EDIT

I've also tested a BERT hub model without any error, so I don't know whether the problem is specific to the ELMo hub model. Has anyone got it working?

rsuwaileh commented Oct 26, 2019

+1 for the request
I tried reading BERT vectors using the following line:
bert_module = hub.Module(bert_path)
but got RuntimeError: Exporting/importing meta graphs is not supported when eager execution is enabled. No graph exists when eager execution is enabled.

config:
Python 3.6.9
tensorflow-2.0.0
bert-tensorflow-1.0.1
tensorflow-hub-0.6.0

Carolyn95 commented Nov 5, 2019

+1 for the request.
I tried to load the USE module using these lines:

use_module = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/3")

def embed_use(x):
    return use_module(tf.reshape(tf.cast(x, 'string'), [-1]),
                      signature='default', as_dict=True)['default']

but got TypeError: 'AutoTrackable' object is not callable

my config:

  • Python == 3.7.2
  • tensorflow_hub == 0.7.0
  • tensorflow == 2.0.0

@rmothukuru rmothukuru reopened this Nov 5, 2019
arnoegw commented Nov 13, 2019

@ReemSuwaileh: For TF2 versions of BERT, please see the new models at https://tfhub.dev/s?publisher=tensorflow&q=bert, published from the reimplementation of BERT in tensorflow/models.
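
As a rough sketch of how those TF2 BERT SavedModels are typically wired into a Keras model (the handle, sequence length, and input/output layout below reflect the early published versions; check the model page for the exact interface):

import tensorflow as tf
import tensorflow_hub as hub

# Example handle from the tensorflow publisher; pick the variant you need.
bert_layer = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/1",
    trainable=True)

max_seq_len = 128  # example value
input_word_ids = tf.keras.Input(shape=(max_seq_len,), dtype=tf.int32, name="input_word_ids")
input_mask = tf.keras.Input(shape=(max_seq_len,), dtype=tf.int32, name="input_mask")
segment_ids = tf.keras.Input(shape=(max_seq_len,), dtype=tf.int32, name="segment_ids")

# The early versions take a list of three int32 tensors and return
# (pooled_output, sequence_output).
pooled_output, sequence_output = bert_layer([input_word_ids, input_mask, segment_ids])
model = tf.keras.Model(inputs=[input_word_ids, input_mask, segment_ids], outputs=pooled_output)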

arnoegw commented Nov 13, 2019

@Carolyn95: To get the TF2 version of google/universal-sentence-encoder-large, please upgrade to https://tfhub.dev/google/universal-sentence-encoder-large/4

For using old models in the hub.Module format with hub.load(), please see https://www.tensorflow.org/hub/migration_tf2 and notice how the signature is picked out from a dict (not passed as an argument).
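
A small sketch of that difference, using the USE handles mentioned in this thread (the input text and the "default" output key shown here are illustrative):

import tensorflow as tf
import tensorflow_hub as hub

# TF2-style SavedModel (version 4): the loaded object is directly callable.
use_v4 = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/4")
embeddings_v4 = use_v4(tf.constant(["the cat is on the table"]))

# Old hub.Module format (version 3) loaded via hub.load(): pick the signature
# from .signatures instead of passing signature=... / as_dict=True.
use_v3 = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/3")
embeddings_v3 = use_v3.signatures["default"](
    tf.constant(["the cat is on the table"]))["default"]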

arnoegw commented Nov 13, 2019

@federicoruggeri: Thanks for reporting the problem about tf.GradientTape and the elmo model. This is important but distinct enough to be tracked separately. I filed it as #416

arnoegw commented Nov 13, 2019

The original feature request has been fulfilled. Please file separate issues if you encounter any problems besides the ones addressed above. Thank you.

@arnoegw arnoegw closed this as completed Nov 13, 2019