//tensorflow/tools/api/tests:api_compatibility_test fails with protobuf > 3.20.3 #59932
Comments
Hi @elfringham, I tried the same commands, but my build succeeds and the test also passes. Please refer to the attached log below and confirm if there is any deviation. Note that I have not installed any protobuf version explicitly; I went with the default from the master branch. I observed that 3.19.6 is the default protobuf version installed by pip, and the same is likely picked up by Bazel. Hence, in my case there is no error, right?
Actually, earlier I did not install any protobuf version explicitly. I have now tried the command after installing the latest protobuf version (4.22.1), and the test fails. The log is attached below for reference.
@SuryanarayanaY You are missing an install of keras and tensorflow-estimator, or alternatively keras-nightly and tf-estimator-nightly.
@elfringham, I am unable to replicate the reported problem. For me the build succeeds and the test passes. I am attaching the logs for your review. Please have a look and confirm if anything is missing. Thanks!
Hi @SuryanarayanaY, it is hard to see what difference in our environments is causing your test to pass. Your use of the 'pip' command masks which version of Python is used, and that may not be the same Python that is picked up by the TensorFlow test build. Also, as you have not run 'configure', the log does not show which Python is in use. There may also be environment variables such as PYTHON_BIN_PATH or PYTHON_LIB_PATH that affect things.
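A minimal sketch of how that environment information could be captured up front so the logs show it (assuming a standard CPython setup; PYTHON_BIN_PATH is consulted only if it has been exported):

```python
# Sketch: print which Python and which protobuf the build is likely to
# pick up, so logs capture the environment. Falls back to the current
# interpreter when PYTHON_BIN_PATH is not set.
import os
import sys

print("python:", os.environ.get("PYTHON_BIN_PATH", sys.executable))
print("version:", sys.version.split()[0])
try:
    import google.protobuf
    print("protobuf:", google.protobuf.__version__)
except ImportError:
    print("protobuf: not installed")
```

Running this in the same shell that invokes bazel would remove the ambiguity about which interpreter and protobuf the test build actually sees.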
Thanks @elfringham, I will definitely wait for your gist.
api_compat_test_fail.zip |
You can see the gist here https://colab.research.google.com/drive/1r8ssFOenTKYlWf0Ca00y2ffg8ifq4ntN?usp=sharing in case you can run it yourself with more resource privileges. |
I too don't have privileged access to Google Colab; it will time out before completion. I can test only in a VM for now. I am surprised that with the same environment I replicated the issue mentioned at #59931 but not this one. Could you please refer to the issue there and confirm whether anything differs? If you ask me to, I can test with python==3.9.16 explicitly using the configure step and will give it a try.
@SuryanarayanaY I found the three issues, #59930, #59931 and this one, #59932, all at the same time with the same environment. So I too am surprised that you see different behaviour between #59931 and here. For me they all pass if I install protobuf==3.20.3 and all fail with protobuf > 3.20.3. I entered three separate issues only because they all have different errors.
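The pass/fail boundary described above can be stated mechanically; a minimal sketch, assuming plain dotted version strings with no pre-release suffixes:

```python
# Sketch: classify a protobuf version against 3.20.3, the last version
# this thread reports as passing. Assumes plain dotted version strings.
def parse(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

KNOWN_GOOD = parse("3.20.3")

def expected_result(version: str) -> str:
    return "pass" if parse(version) <= KNOWN_GOOD else "expect failures"

print("3.19.6 ->", expected_result("3.19.6"))  # pass
print("3.20.3 ->", expected_result("3.20.3"))  # pass
print("4.22.1 ->", expected_result("4.22.1"))  # expect failures
```

Tuple comparison is what makes 4.22.1 sort above 3.20.3 despite the very different numbering schemes on either side of the 3.x/4.x boundary.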
I have used the same build as before. It shows Python 3.9.16 being used for running the tests, but I noticed one test being skipped automatically. Please check the tests (6/7) that are executed.
After the above log, I observed in the immediately following log that one of the tests is skipped.
The same test was skipped in the earlier build as well, for which I attached logs in comment-1469880441. Could you please cross-check whether that test is skipped or executed in your environment too? Thanks!
I see the same skipped test. The failures I see are in testAPIBackwardsCompatibility, testAPIBackwardsCompatibilityV1 and testAPIBackwardsCompatibilityV2. Please see the log I attached above.
Hi @vam-google, Could you please take a look? |
I have cross-checked with the r2.12 branch, and here also the test passed for me. Please refer to the attached logs below.
I'm struggling to understand what is going on in all three bugs reported here. Currently the TF bazel build is not hermetic in terms of Python dependencies (we are working on fixing it), so whatever actually gets installed on your machine will be used by your TF build. In terms of compatibility: honestly, I don't know what those failing tests are testing for, but in general protobuf 3.18, 3.19, 3.20 and 4.21 are very different versions, and they represent a transition path to non-compatible versions (4.21 is not compatible with 3.18). So in principle some breakage in terms of compatibility is logical here. One other important piece: we depend on protobuf in both phases, generation/compilation and runtime. This is important because generation time is what is responsible for the creation of those generated files.
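A hedged sketch of that two-phase dependency and one documented escape hatch (not verified against this exact test): when protobuf's C++ descriptor implementation rejects `*_pb2.py` files generated by an older protoc, forcing the pure-Python implementation sometimes unblocks things:

```python
# Sketch: protobuf is used in two phases. protoc emits *_pb2.py files at
# build time, and the google.protobuf runtime loads them at test time;
# a mismatch across the 3.20 -> 4.x boundary can make the runtime reject
# the generated code. PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python is a
# documented protobuf knob that selects the slower pure-Python runtime,
# which is more tolerant of older gencode. It must be set before the
# first google.protobuf import.
import os

os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
# ...only now import modules that pull in google.protobuf
```

This is a diagnostic workaround rather than a fix; whether it affects these particular api_compatibility_test failures would need to be checked.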
These tests do pass in CI, but that is because CI is run using protobuf==3.20.3, see
Yes, the tests using 3.19 are irrelevant; it is not supported. The unit tests that fail are being run in the source tree, so of course they are using the same protobuf for the build and the test. This is not testing a wheel installed into a separate environment. My point is that TensorFlow claims to be compatible with protobuf 4.x and yet it fails these tests when protobuf > 3.20.3 is installed.
Issue Type
Bug
Have you reproduced the bug with TF nightly?
Yes
Source
source
Tensorflow Version
git HEAD
Custom Code
No
OS Platform and Distribution
Ubuntu 20.04
Mobile device
n/a
Python version
3.9.16
Bazel version
5.3.0
GCC/Compiler version
9.4.0
CUDA/cuDNN version
n/a
GPU model and memory
n/a
Current Behaviour?
The api_compatibility_test fails when protobuf > 3.20.3 is installed.
Standalone code to reproduce the issue
```shell
bazel test \
  --test_timeout=300,500,-1,-1 \
  --flaky_test_attempts=3 \
  --test_output=all \
  --cache_test_results=no \
  --noremote_accept_cached \
  --test_env=TF_ENABLE_ONEDNN_OPTS=1 \
  --build_tag_filters=-no_oss,-oss_serial,-gpu,-tpu,-benchmark-test,-v1only,-requires-gpu \
  --test_tag_filters=-no_oss,-oss_serial,-gpu,-tpu,-benchmark-test,-v1only,-requires-gpu \
  --verbose_failures \
  --build_tests_only \
  --jobs=16 \
  -- //tensorflow/tools/api/tests:api_compatibility_test
```
Relevant log output