
WARNING:tensorflow:Entity <function <lambda> at 0x000002343DCF24C8> could not be transformed and will be executed as-is. #38691

Closed
laghaout opened this issue Apr 19, 2020 · 15 comments
Assignees
Labels
comp:autograph Autograph related issues TF 2.0 Issues relating to TensorFlow 2.0 type:bug Bug

Comments

@laghaout

What can be done to solve this warning?

Warning message

Below is the full warning:

WARNING:tensorflow:Entity <function <lambda> at 0x000002343DCF24C8> could not be transformed and will be executed as-is. Please report this to the AutoGraph team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output. Cause: module 'gast' has no attribute 'Str'
WARNING: Entity <function <lambda> at 0x000002343DCF24C8> could not be transformed and will be executed as-is. Please report this to the AutoGraph team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output. Cause: module 'gast' has no attribute 'Str'

Minimum working example

The code that generated the warning was discussed in #38471 and stored in this gist.

import tensorflow as tf

def compute_length(x):
    return tf.strings.length(x)

def check_substring(x, substring):
    return tf.strings.regex_full_match(x, substring)

def compute_palindrome(x):
    extra_split = tf.strings.bytes_split(x)
    reverse = tf.reverse(extra_split, [0])
    reversed_str = tf.strings.reduce_join([reverse])
    return reversed_str

ds = tf.data.Dataset.from_tensor_slices(["Ottawa", "Stockholm", "Rabat"])

ds = ds.map(
    lambda city: (city,
                  compute_length(city),
                  check_substring(city, ".*lm.*"),
                  compute_palindrome(city),
                  ),
)

num_elems = len(ds.element_spec)
for elem in ds:
    print(''.join(f"{elem[i]}" for i in range(num_elems)))

Environment

  • python 3.7.4
  • tensorflow-gpu 2.0.0
  • tensorflow-datasets 1.3.0
  • gast 0.2.2

Running on Windows 10 under conda 4.8.3.

@Saduf2019
Contributor

@laghaout
I ran the shared code on tf-nightly and did not find any error; please find the gist here, as well as on 2.0.

Could you please refer to these issues and let us know if any of them helps:
link1 link2 link3 link4 link5

@Saduf2019 Saduf2019 added TF 2.0 Issues relating to TensorFlow 2.0 comp:autograph Autograph related issues stat:awaiting response Status - Awaiting response from author labels Apr 20, 2020
@laghaout
Author

@Saduf2019, I read the issues you linked to, but I'm afraid none of them solves the problem. While the code seems to run without warnings on Colab, the warnings appear on two different machines I've tested it on (Windows 10 and Debian).

The exact versions of the packages are as follows:

user@user-G751J ~
$ conda list | grep -i gast
gast                      0.2.2                    pypi_0    pypi

user@user-G751J ~
$ conda list | grep -i tensorflow
tensorflow-datasets       1.3.0                    pypi_0    pypi
tensorflow-estimator      2.0.1                    pypi_0    pypi
tensorflow-gpu            2.0.0                    pypi_0    pypi
tensorflow-hub            0.6.0                    pypi_0    pypi
tensorflow-metadata       0.15.0                   pypi_0    pypi

@laghaout
Author

Please note that the problem persists even with the following upgrades:

gast                      0.3.3                    pypi_0    pypi
tensorflow-estimator      2.0.1                    pypi_0    pypi
tensorflow-gpu            2.1.0                    pypi_0    pypi
tensorflow-gpu-estimator  2.1.0                    pypi_0    pypi

@Saduf2019 Saduf2019 removed the stat:awaiting response Status - Awaiting response from author label Apr 22, 2020
@jvishnuvardhan
Contributor

@laghaout Can you please test with tf-nightly or TF 2.2.0rc3 and let us know whether the error persists with the latest versions as well? Thanks!

@mdanatg

mdanatg commented Apr 22, 2020

The following versions should work (note that pip install should pull the correct dependencies in TF 2.2 and later):

TF version       gast version
2.0, 2.1, 2.2    0.2.2
2.3, nightly     0.3.3

To make sure you have the correct version installed, please try force-installing it: pip install gast==<version> --force-reinstall.
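As a quick sanity check (a minimal stdlib-only sketch; the helper name is mine, and gast may not be installed in every environment), you can probe whether the installed gast still exposes the pre-0.3 `Str` AST node. Its absence is what produces the `module 'gast' has no attribute 'Str'` cause in the original warning:

```python
import importlib

def gast_exposes_str():
    """Return True/False if gast is importable, None otherwise.

    gast 0.2.x exposes a Str node; gast 0.3.x folded Str/Num/Bytes
    into Constant, which triggers "module 'gast' has no attribute 'Str'"
    in TF versions that expect gast 0.2.2.
    """
    try:
        gast = importlib.import_module("gast")
    except ImportError:
        return None  # gast not installed in this environment
    return hasattr(gast, "Str")

print(gast_exposes_str())
```

If this prints `False` on TF 2.0/2.1, the force-reinstall above should bring gast back to 0.2.2.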

@laghaout
Author

@mdanatg, @jvishnuvardhan: I followed your suggestions, but just to double-check, my packages are

tensorflow-datasets       1.3.0                    pypi_0    pypi
tensorflow-estimator      2.0.1                    pypi_0    pypi
tensorflow-gpu            2.1.0                    pypi_0    pypi
tensorflow-gpu-estimator  2.1.0                    pypi_0    pypi
tensorflow-hub            0.6.0                    pypi_0    pypi
tensorflow-metadata       0.15.0                   pypi_0    pypi

and

gast                      0.2.2                    pypi_0    pypi

However, the mystery has now thickened: the message does not show up when the code is run in Jupyter, but persists when the code is run in Spyder. In the latter case, the warning only appears when the code is re-run; it does not show on the first run.

Furthermore, the warning is essentially the same as the one I posted initially, though with slightly different wording:

WARNING:tensorflow:AutoGraph could not transform <function <lambda> at 0x000002E2EC2049D8> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: 
WARNING: AutoGraph could not transform <function <lambda> at 0x000002E2EC2049D8> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: 
b'Ottawa'6Falseb'awattO'
b'Stockholm'9Trueb'mlohkcotS'
b'Rabat'5Falseb'tabaR'

@mdanatg

mdanatg commented Apr 23, 2020

Thanks for checking. As long as you don't see any module 'gast' has no attribute "foo" message, the version of gast is correct, so we must look for a different cause.

To get a bit more information, could you set this env var: export AUTOGRAPH_VERBOSITY=3 and replicate the warning messages? That should generate a lot of log output that will be useful to tell what's going on.
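For reference, the environment variable can also be set from inside the script rather than the shell (a sketch; note it must be set before TensorFlow is imported, since AutoGraph reads it at import time):

```python
import os

# Must be set before `import tensorflow` so AutoGraph picks it up.
os.environ["AUTOGRAPH_VERBOSITY"] = "3"

# After importing TensorFlow, the in-process equivalent is
# tf.autograph.set_verbosity(3, alsologtostdout=True)  # not run here

print(os.environ["AUTOGRAPH_VERBOSITY"])
```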

@laghaout
Author

@mdanatg, please find the log here.

@mdanatg

mdanatg commented Apr 24, 2020

@laghaout Thank you. The log doesn't seem to contain any errors - did the warning appear while collecting it?

@laghaout
Author

@mdanatg I had forgotten that the warning does not show up on the first run. Here it is on the second run.

@mdanatg

mdanatg commented Apr 24, 2020

Thank you, that helped! It seems that Spyder may decide to unload modules. Will send a fix.

@mdanatg

mdanatg commented Apr 24, 2020

Actually, this should already be fixed in tf-nightly. The fix will be released in TF 2.3.

Closing the issue for now, but please re-open if you still see the warnings with tf-nightly.

@mdanatg mdanatg closed this as completed Apr 24, 2020

@ssnirgudkar

Hello,
I am witnessing an identical issue with a very simple command. In fact, it is one of the demonstrated usages of the API.

Linux : Ubuntu 18.04
I am running docker container. Inside the docker container,
TensorFlow version : 1.15
Python version : 3.6.9
gast version : 0.2.2 ( checked with command 'pip freeze')

Command (I was looking at this example):

d = tf.data.Dataset.from_tensor_slices([1, 2, 3])
d = d.filter(lambda x: x < 3)

WARNING:tensorflow:Entity <function <lambda> at 0x7ff0f804fb70> could not be transformed and will be executed as-is. Please report this to the AutoGraph team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output. Cause: Unable to locate the source code of <function <lambda> at 0x7ff0f804fb70>. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code
WARNING: Entity <function <lambda> at 0x7ff0f804fb70> could not be transformed and will be executed as-is. Please report this to the AutoGraph team. When filing the bug, set the verbosity to 10 (on Linux, export AUTOGRAPH_VERBOSITY=10) and attach the full output. Cause: Unable to locate the source code of <function <lambda> at 0x7ff0f804fb70>. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code

Any idea why I am seeing this problem and how to fix it? Unfortunately, I cannot verify it in TF 2.x because I cannot install it easily on this machine, as it requires other dependencies (CUDA > 11.0) to be installed correctly.

@mdanatg

mdanatg commented Dec 28, 2020

Hi @ssnirgudkar, the error message indicates an environment-related incompatibility. The root cause is indicated by this piece of the message: "Original error: could not get source code". This section in the docs has a few details why.

Given that the lambda function is simple, you could try wrapping it in tf.autograph.experimental.do_not_convert, but you are likely to see the same error for all the other functions involved. That type of error will also happen if you upgrade to a newer version of TF, so I think it's worth looking at the environment instead.

Are you getting this error when trying to run the code in an interactive Python or IPython shell? I know those to be incompatible. Running the code as a Python file, or using something like Jupyter should work though.
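The failure mode described here can be reproduced without TensorFlow at all, since AutoGraph relies on the standard-library `inspect` module to fetch a function's source. A stdlib-only sketch (using `exec` on a string as a stand-in for a function typed into an interactive shell):

```python
import inspect

# A lambda compiled from a string -- the moral equivalent of one typed
# into an interactive Python shell -- has no backing .py file, so
# inspect cannot recover its source. AutoGraph hits the same wall,
# hence "Original error: could not get source code".
namespace = {}
exec("f = lambda x: x < 3", namespace)

try:
    inspect.getsource(namespace["f"])
    print("source available")
except OSError as err:
    print(err)  # could not get source code
```

Running the same lambda from a saved .py file makes its source retrievable, which is why moving the code out of the interactive shell silences the warning.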


5 participants