Fix xFail errors from pytest #6940
Conversation
…gAccessError" This reverts commit 4942170.
… since the mock decorator does not work well with pytest
.github/workflows/ci-test.yml
Outdated
```diff
 shell: bash
 run: |
-  pytest -m "${{ matrix.which-tests }}"
+  pytest -vv -m "${{ matrix.which-tests }}" tfx
```
We had specifically not chosen to put verbose output originally because there often could be an overwhelming amount of output on GitHub and developers can run pytest with verbose output on their local machines instead. If you would really like verbose output, leave it, by all means.
CC: @peytondmurray
Also, is specifying the tfx directory strictly necessary here? I believe it is already specified in the pytest config in pyproject.toml
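For reference, the kind of `pyproject.toml` setting being referred to might look like this (a hypothetical fragment; the actual tfx configuration may differ):

```toml
[tool.pytest.ini_options]
testpaths = ["tfx"]
```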
I've reverted this change.
```diff
 from tfx.experimental.distributed_inference.graphdef_experiments.subgraph_partitioning import create_complex_graph
 from tfx.experimental.distributed_inference.graphdef_experiments.subgraph_partitioning import graph_partition

+tf.compat.v1.enable_eager_execution()  # Re-enable eager mode
```
If we need eager execution mode for some tests, my strong preference and suggestion is to use a pytest yield fixture as a decorator to enable and disable it for the tests that need it. The fixture can be stored in a conftest.py and reused anywhere it is needed.
CC: @peytondmurray
☝️ To add context to this point:
- `tfx` has a hard dependency on `tensorflow>=2.15, <2.16`. Eager execution is enabled by default for `tensorflow>=2`, so I think this may not be necessary?
- This setting unilaterally applies global changes to the testing environment, impacting much more than just the tests here. Even if `tfx` did allow users to install `tensorflow<2`, I'd still vote to move this into a test fixture to limit the impact on all the other tests in the test run.

On a related note, how many folks are still running TF<2? TF 2.0 came out over 5 years ago. If tfx still has tests that target deprecated 1.x functionality, it may be worth starting an issue/discussion to remove them, as I don't think tfx can be expected to run them at this point.
The issue stemmed from the create_complex_graph module, which forcibly disables eager execution upon initialization. I've added a pytest yield fixture to re-enable it when the tests using this module end.
The [pytest yield fixture](https://docs.pytest.org/en/stable/how-to/fixtures.html#yield-fixtures-recommended) approach is not working, since disable_eager_execution() does not behave as expected. This might be because disable_eager_execution() is called during pytest's collection phase, as it is also called during module initialization. This code is from an older part of the project, and I'm unsure if it's safe to remove. However, since it's within experimental code, I'll comment it out for now.
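To illustrate the timing issue described above with a minimal, self-contained sketch (the function and module body here are hypothetical stand-ins, not tfx code): pytest imports test modules while collecting them, so any call made at module level runs before any fixture's setup could intervene:

```python
events = []


def fake_disable_eager_execution():
    # Stand-in for tf.compat.v1.disable_eager_execution(), which the
    # create_complex_graph module calls at import time.
    events.append("eager disabled")


# A module body is plain code; executing it here models what happens
# the moment pytest imports the file during collection:
MODULE_BODY = "fake_disable_eager_execution()"
exec(MODULE_BODY, {"fake_disable_eager_execution": fake_disable_eager_execution})

# The side effect has already fired, before any test or fixture ran:
assert events == ["eager disabled"]
```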
```diff
 from tfx.experimental.distributed_inference.graphdef_experiments.subgraph_partitioning import graph_partition
 from google.protobuf import text_format

+tf.compat.v1.enable_eager_execution()  # Re-enable eager mode
```