Support Python/C++ interoperability (native Python modules) #701
Comments
Some related discussion here: https://groups.google.com/forum/#!searchin/bazel-discuss/pex/bazel-discuss/-DFOrVqK8aY/-7h4e69hfkEJ
Recategorizing this as a feature request to support Python-C++ interop. I marked the other report that asked for that as a dupe.
A workaround I found: you can include pre-compiled
The trick to building a .so that works as a python extension seems to be making it a
I stumbled upon this one also recently. Please at least change the documentation or (ideally) increase the priority to get this fixed for good. Python libraries and binaries can and do depend on C libraries. This is currently not really easy to model in Bazel. In the meantime it would also be nice to have this workaround documented somewhere.
@MarkusTeufelberger we love contributions 😃
The problem is that we don't know how to fix it. We could export our internal model, but we're unsure if that's what people want. Our internal model is that all cc_library rules in deps of py_library rules get merged into a single .so at the py_binary level, together with all cc_library dependencies of py_extension rules, except the py_extension code itself, which gets compiled into a separate .so. It sounds awkward, and it is, but there doesn't seem to be any other principled approach to avoid ODR violations in the C++ code.
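To make the model above concrete, here is a hypothetical BUILD sketch. Note that `py_extension` is not a public Bazel rule, and all target names here are made up for illustration; this only diagrams the linking scheme described in the comment.

```python
# Hypothetical BUILD sketch of the internal model described above.
# `py_extension` is not a public Bazel rule; labels are illustrative.

cc_library(
    name = "common",      # shared C++ code used by several extensions
    srcs = ["common.cc"],
)

py_extension(             # hypothetical rule
    name = "_fast_ops",   # only this code goes into its own small .so
    srcs = ["fast_ops.cc"],
    deps = [":common"],
)

py_library(
    name = "ops",
    srcs = ["ops.py"],
    deps = [":_fast_ops"],
)

# At the py_binary level, ":common" (and every other cc_library in the
# transitive deps) would be merged into ONE shared object; each extension's
# own .so links against it dynamically, so no C++ symbol is duplicated
# and ODR violations are avoided.
py_binary(
    name = "app",
    srcs = ["app.py"],
    deps = [":ops"],
)
```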
+1 for exporting the internal model. My project is considering switching from Bazel to Makefiles solely to avoid the ODR violations we are currently seeing when using Bazel + CLIF + protobufs.
See related discussion in #1475.
I have a draft of a Starlark implementation of something like the internal Google model (all native deps are linked together into a statically-linked shared object; the modules themselves are dynamically linked to it). Would it be interesting to share this? Perhaps on rules_python?
@quval It would certainly be interesting yes. :)
OK, here goes: https://github.com/quval/rules_native_python/ (Edit: just pushed a revised version. Edit 2: seems to have some issues when using gold - will update again once resolved. Edit 3: seems to be all right now. Edit 4, 11/2022: just pushed a major update after the library got some real use.)
The following implementation seems to work for me without issue, are there any deficiencies I'm overlooking?

```python
# py_native_library.bzl
def py_native_library(name, **kwargs):
    native.cc_binary(
        name = name + ".so",
        linkshared = True,
        **kwargs
    )
    native.py_library(
        name = name,
        data = [name + ".so"],
    )
```

Edit:

```python
# py_native_library.bzl
def py_native_library(name, **kwargs):
    native.cc_binary(
        name = name + ".so",
        linkshared = True,
        linkstatic = False,
        **kwargs
    )
    native.py_library(
        name = name,
        data = [name + ".so"],
    )
```

is more correct, covering the common case where more than one native extension is loaded into a Python executable (so that shared dependencies are properly dynamically linked, rather than statically linked into each extension).
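For reference, usage of a macro like the one above might look as follows in a BUILD file. This is a sketch with made-up target and file names; it assumes the C++ source provides a proper extension entry point (e.g. `PyInit_fast_math` for Python 3), which the macro itself does not enforce.

```python
# BUILD (illustrative; assumes py_native_library.bzl as defined above)
load("//tools:py_native_library.bzl", "py_native_library")

cc_library(
    name = "math_impl",
    srcs = ["math_impl.cc"],
)

# Produces "fast_math.so" (a cc_binary with linkshared = True,
# linkstatic = False) plus a py_library carrying it as a data dep.
py_native_library(
    name = "fast_math",
    srcs = ["fast_math.cc"],  # must define PyInit_fast_math for Python 3
    deps = [":math_impl"],
)

py_binary(
    name = "app",
    srcs = ["app.py"],  # does `import fast_math` at runtime
    deps = [":fast_math"],
)
```

One caveat with this approach: because the .so is only a data dependency, Python finds it via the runfiles layout and `sys.path`, so the import only works if the .so ends up in a directory Python actually searches.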
Hi ulfjack, how does Google handle multiple py_library targets with a common cc_library dependency? For example, suppose I have two py_library targets that both depend on some cc_library X. Initially we may use those two py_library targets separately, but at some point people may use both of them together. I think in this situation there are still ODR violations?
See #23676 for context. This is another attempt at that, as I figured out what's going wrong in `bazel test`. Supersedes #24828. Now that there are Python 3.10 wheels for Ray 1.13 and this is no longer a blocker for supporting Python 3.10, I still want to make `bazel test //python/ray/tests/...` work for developing in a 3.10 env, and to make it easier to add Python 3.10 tests to CI in the future. The change contains three commits with rather descriptive commit messages, which I repeat here:

Pass deps to py_test in py_test_module_list. The Bazel macro py_test_module_list takes a `deps` argument but completely ignores it instead of passing it to `native.py_test`. Fixing that, as we are going to use the deps of py_test_module_list in BUILD files in later changes. cpp/BUILD.bazel depends on the broken behaviour: it deps-on a cc_library from a py_test, which isn't working; see upstream issue bazelbuild/bazel#701. This is fixed by simply removing the (non-working) deps.

Depend on conftest and data files in Python tests' BUILD files. Bazel requires that all the files used in a test run be represented in the transitive dependencies specified for the test target. For py_test, that means srcs, deps, and data. Bazel enforces this constraint by creating a "runfiles" directory, symbolically linking the files in the dependency closure, and running the test in the "runfiles" directory, so that the test shouldn't see files not in the dependency graph. Unfortunately, the constraint does not apply for a large number of Python tests, because pytest (>=3.9.0, <6.0) resolves these symbolic links during test collection and effectively "breaks out" of the runfiles tree. pytest >= 6.0 introduces a breaking change that removed the symbolic-link-resolving behaviour; see pytest pull request pytest-dev/pytest#6523 for more context. Currently, we are underspecifying dependencies in a lot of BUILD files and thus blocking ourselves from updating to a newer pytest (for Python 3.10 support). This change hopefully fixes all of them, and at least those in CI, by adding data or source dependencies (mostly for conftest.py files) where needed.

Bump pytest version from 5.4.3 to 7.0.1. We want at least pytest 6.2.5 for Python 3.10 support, but not past 7.1.0 since it drops Python 3.6 support (which Ray still supports), so the version constraint is set to <7.1. Updating pytest, combined with the earlier BUILD fixes, changed the ground truth of a few error-message-based unit tests; these tests are updated to reflect the change. There are also two small drive-by changes for making test_traceback and test_cli pass under Python 3.10. These were discovered while debugging CI failures (on earlier Python versions) with a Python 3.10 install locally. Expect more such issues when adding Python 3.10 to CI.
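The "depend on conftest and data files" fix described above amounts to listing every file a test reads explicitly on its target. A rough sketch of what such a BUILD entry might look like (all labels and file names here are illustrative, not taken from the actual Ray repository):

```python
# BUILD (illustrative): make conftest.py and data files visible in runfiles.
py_library(
    name = "conftest",
    srcs = ["conftest.py"],
    testonly = True,
)

py_test(
    name = "test_basic",
    srcs = ["test_basic.py"],
    deps = [":conftest"],       # pytest fixtures now resolve inside runfiles
    data = ["test_data.json"],  # every file the test opens must appear here
)
```

With newer pytest no longer resolving symlinks out of the runfiles tree, any file missing from srcs, deps, or data simply won't exist from the test's point of view.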
@mfarrugi thanks for your example. This works when I am using pybind11, i.e. when I want to call C++ functions from Python. But how about the other way around, calling Python functions from C++? Can I do this:

```python
def cc_native_library(name, py_srcs, py_deps, cc_srcs, cc_deps, **kwargs):
    native.py_library(
        name = name + ".py",
        srcs = py_srcs,
        deps = py_deps,
        **kwargs
    )
    native.cc_library(
        name = name,
        srcs = cc_srcs,
        linkshared = True,
        linkstatic = False,
        data = [name + ".py"],
        # depend on the python3 path location as well
        deps = cc_deps + [":python3"],
    )
```

I want to use the cc_native_library for a cc_binary.
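For the reverse direction (C++ calling Python), the usual approach is embedding the interpreter rather than building a shared extension: the cc_binary links against libpython and carries the .py files as runfiles. A very rough sketch under those assumptions; the `@python//:libpython` label is hypothetical (the real label depends on your Python toolchain setup), and note that cc_library does not accept `linkshared`, which is a cc_binary-only attribute.

```python
# BUILD (illustrative only; "@python//:libpython" is a made-up label
# standing in for however your workspace exposes the CPython library).
cc_binary(
    name = "embed_app",
    srcs = ["embed_app.cc"],         # calls Py_Initialize(), runs logic.py
    deps = ["@python//:libpython"],  # link the embedded interpreter
    data = ["logic.py"],             # Python sources go in runfiles, not deps
)
```

The embedding program then has to locate its runfiles directory at startup and put it on the interpreter's module search path before importing the Python code.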
Thank you for contributing to the Bazel repository! This issue has been marked as stale since it has not had any activity in the last 1+ years. It will be closed in the next 90 days unless any other activity occurs. If you think this issue is still relevant and should stay open, please post any comment here and the issue will no longer be marked as stale. |
This issue has been automatically closed due to inactivity. If you're still interested in pursuing this, please post |
@bazelbuild/triage |
According to the documentation for `py_library.deps`, a `py_library` may depend on a `cc_library`. However, when I try to build a `py_library` target that depends on a `cc_library`, I get an error. According to `BazelPyRuleClasses.java`, the only allowed rule types in `py_library.deps` are, in fact, `py_binary` and `py_library`.

What is the plan for Python-C interoperability? Should we update the documentation to remove the mention of depending on `cc_library` in the meantime?

The background for this is that I am trying to use Jinja for generating HTML for the Skylark docgen tool that I am working on. However, Jinja depends on Markupsafe, which is partly implemented in C.