Feature request: use Hub modules with Eager execution enabled #124
Comments
Hi Nathan, using Hub modules from Eager mode would be great, but regrettably it is not possible at this point in time. I'll keep this issue open for the feature request.
+1 for the request
@arnoegw Considering that Eager would be the default execution mode from TF 2 and upwards, wouldn't this feature request be more than just a "nice to have"? I'm curious whether there is any timeline for eager support.
@gokul-uf: Agreed. We are working on TF2 support, including Eager mode, in time for the TF 2.0 release (not preview).
+1 for this
Looking forward to this. I'm an early TF 2.0 adopter but can't figure out how to use TF Hub within the TF 2.0 preview version.
Closing this feature request, as Hub modules can be used with eager execution enabled from the TensorFlow 2.0 Alpha version onwards. Please feel free to reopen this issue if your requirement is not implemented.
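For readers landing here later, the TF2 usage pattern looks roughly like the following. This is a minimal sketch: the tfhub.dev URL appears only in a comment, and a stand-in callable replaces the downloaded model so the snippet runs offline; it is not the exact API surface of any particular module.

```python
import tensorflow as tf

# In TF2, eager execution is enabled by default: no Graph or Session needed.
assert tf.executing_eagerly()

# With network access, a TF2-format model from tfhub.dev would be loaded as:
#   import tensorflow_hub as hub
#   embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/4")
# A stand-in callable with the same call shape keeps this sketch offline.
embed = lambda sentences: tf.random.uniform([len(sentences), 512])

# The loaded object is called directly on eager inputs, like any function.
embeddings = embed(["Hello world", "Eager execution"])
print(embeddings.shape)
```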
+1 for this
+1
I checked with the recently released TensorFlow 2.0.0 (stable), and the issue still persists. I tried accessing the ELMo vectors using the following tensorflow-hub line: Exception:
Please use
...optionally with a tf.GradientTape() around the call. More detailed documentation on TF Hub in TF2 will be forthcoming. For TF2-style SavedModels in general, please see https://www.tensorflow.org/guide/saved_model
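The tf.GradientTape() pattern mentioned above can be sketched like this, with a small trainable stand-in module in place of a downloaded hub model (the hub.load call is shown only as a comment); names like ToyEmbedder are illustrative, not part of any library.

```python
import tensorflow as tf

# Stand-in for a TF2 SavedModel that would come from tensorflow_hub, e.g.:
#   model = hub.load(<tfhub.dev model URL>)
class ToyEmbedder(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.ones([4, 8]))

    def __call__(self, x):
        return tf.matmul(x, self.w)

model = ToyEmbedder()
x = tf.random.uniform([2, 4])

# Wrapping the call in a GradientTape records the forward pass eagerly,
# so gradients w.r.t. the module's variables can be taken afterwards.
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(model(x))

grads = tape.gradient(loss, model.trainable_variables)
print([tuple(g.shape) for g in grads])  # [(4, 8)]
```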
Hello,
And it throws AttributeError: 'NoneType' object has no attribute 'outer_context' when calling the loaded ELMo model. On the other hand, removing the
TensorFlow config: Python config:
EDIT: I've also tested the BERT hub model without any error being reported, so I don't know whether the problem is exclusively attributable to the ELMo hub model. Has anyone got it working?
+1 for the request
+1 for the request. I tried

```python
use_module = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/3")

def embed_use(self, x):
    return self.use_module(tf.reshape(tf.cast(x, 'string'), [-1]), signature='default', as_dict=True)['default']
```

but got an error. My config:
@ReemSuwaileh: For TF2 versions of BERT, please see the new models https://tfhub.dev/s?publisher=tensorflow&q=bert published from the reimplementation of BERT from tensorflow/models. |
@Carolyn95: To get the TF2 version of google/universal-sentence-encoder-large, please upgrade to https://tfhub.dev/google/universal-sentence-encoder-large/4 For using old models in the hub.Module format with hub.load(), please see https://www.tensorflow.org/hub/migration_tf2 and notice how the signature is picked out from a dict (not passed as an argument). |
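The "signature picked out from a dict" pattern from the migration guide can be sketched as follows. To keep the sketch offline, a tiny model is saved locally and loaded with tf.saved_model.load(), which behaves like hub.load() does for a real tfhub.dev URL; the Toy class and temp path are illustrative assumptions, not the actual universal-sentence-encoder model.

```python
import tempfile

import tensorflow as tf

# A tiny module with a named serving signature, standing in for a model
# that would normally be downloaded from tfhub.dev.
class Toy(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
    def default(self, text):
        # Legacy hub.Module signatures return a dict of named outputs.
        return {"default": tf.strings.length(text)}

toy = Toy()
path = tempfile.mkdtemp()
tf.saved_model.save(toy, path, signatures={"default": toy.default})

# For a real model this would be: m = hub.load("https://tfhub.dev/...")
m = tf.saved_model.load(path)

# The signature is picked out of the .signatures dict (not passed as a
# signature= argument), and its result is again a dict of named outputs.
f = m.signatures["default"]
out = f(text=tf.constant(["hello", "eager"]))["default"]
print(out.numpy())  # [5 5]
```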
@federicoruggeri: Thanks for reporting the problem about tf.GradientTape and the elmo model. This is important but distinct enough to be tracked separately. I filed it as #416 |
The original feature request has been fulfilled. Please file separate issues if you encounter any problems besides the ones addressed above. Thank you. |
Hi, I guess this is best classified as a support request. I'm trying to use a Hub resnet image model in a transfer learning application. Most of my project takes advantage of tensorflow's eager execution mode. When I add in the image module, I am met with the error below at the line that should download and import the module (it is inside of a tfe.Network.__init__() method). The main message is pretty straightforward: no graph exists in eager mode, therefore the Hub module graph cannot be imported. Is there a way around this, or will I have to dump the variables to a npy file and use them to initialize eager tensors? Some information about my environment: