The "standard" installation use pyproject.toml
in UV rather than dynamic dependencies via build hooks (comparing to PIP)
#2130
Comments
I have an update. I realized that the problem could be because we were not really PEP 621 compliant in Airflow: we mixed "dynamic" and "static" project properties. However, as of today we fixed that in `apache-airflow` in apache/airflow#38439, and the problem remains. Both "dependencies" and "optional-dependencies" are set as dynamic in pyproject.toml.
It looks like:

PIP case: with pip (output collapsed).

UV case: the same exercise with uv - I ran it with the latest version as of today (output collapsed).

This wrongly installs just the airflow package and misses both the dynamic "dependencies" and the dynamic "optional-dependencies" that should be derived from running the build hook (output collapsed).

This correctly installs both the dynamically derived "dependencies" and "optional-dependencies". See the (expected) lack of "apache-airflow-providers-amazon" and the presence of "moto" - both indicate that the dynamic metadata was applied (output collapsed).
There's something happening here but I don't quite understand what yet.

(We do run the PEP 517 build hooks in all cases here.)

Just to add a hypothesis/lead: it could be that this is somehow interacting with the way "hatch_build.py" interacts with the hatchling backend build. I had a bit of a hard time guessing how to properly update both dependencies and optional-dependencies in the hook, and I settled on an approach that works as described above and also nicely generates the .whl packages.

While the "optional-dependencies" part might be somewhat hacky, the "dependencies" part seems to be a perfectly justified way of doing it, and it also works nicely for the "editable" case. Hypothesis: maybe the version is not set to "standard" when you run the hook? That could cause the behaviour observed.
So, this only installs airflow:
But if I run again without clearing the cache...
Then I get all 150 dependencies.
Yes, the build hook is run properly. I just ran it with a little exception raised just before exiting:

    # with hatchling, we can modify dependencies dynamically by modifying the build_data
    build_data["dependencies"] = self._dependencies
    raise Exception(build_data)  # <-- added this

and it seems all good - at least for the clean run of:

result:

Still, when I remove the exception:
Hmm, it seems related to our use of ...
Maybe I should set it differently in ...? But a) I have no idea how, and at least "dependencies" seems to be the right way of doing it. So it looks like something on the ...
Editable with exception:
After removing the exception:
Perhaps we can ask Ofek if he has any ideas why the generated metadata is different here.

cc: @ofek ?
(I think the reason it works on the second invocation is that the wheel is built, and we read the metadata from the wheel if it's available. But in the first invocation, during resolution, we do ...)
I believe this is what ...
And that might also explain why editable works - because you are doing it in a PEP 660-compatible way (which also builds an "editable" wheel as an intermediary medium of metadata)?
I'm pretty sure that the metadata returned by ...
Hmmm. Then we need @ofek to chime in :). BTW, just to add a sense of ... I worked around it in our CI image-building process, so I download and unpack the ... It works for ...
Does it work if you explicitly gate by the value of this, e.g. ...? Both built-in targets ...
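If I read the (truncated) suggestion correctly, "the value of this" refers to the `version` argument that hatchling passes to `initialize` - `"standard"` for a regular wheel build and `"editable"` for a PEP 660 build. A hedged sketch of such a gate:

```python
# Sketch of gating on the build version - illustrative only.
from hatchling.builders.hooks.plugin.interface import BuildHookInterface


class ExampleBuildHook(BuildHookInterface):
    def initialize(self, version, build_data):
        if version == "standard":
            # Regular wheel: inject the dependencies meant for the published package.
            build_data["dependencies"] = ["example-provider-package"]  # placeholder
        else:  # "editable"
            # PEP 660 editable install: inject the development-time set instead.
            build_data["dependencies"] = ["example-provider-dependency"]  # placeholder
```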
I'm just getting up to speed on the issue and don't immediately see what's happening.

Oh, never mind, I think I see the issue: https://github.com/pypa/hatch/blob/hatchling-v1.22.4/backend/src/hatchling/build.py#L86-L99 Charlie, let me know how you would like me to fix it. I think I have to account for your code base now as well.
In short: ...

AAARGH... If ...
@ofek -- I assume there's no way for uv or for Hatch to detect when the hook can be trusted vs. not? E.g., could uv just avoid calling this when requirements are declared as "dynamic"?

That could also work, I think.

Is this unique to Hatch since it supports extensions? Would other build backends need similar gating?

setuptools would have the same issue in theory for very dynamic builds. Edit: really any such backend that allows for code execution during builds to modify metadata - maybe PDM also.

Ok, I'll add that gate in uv for now.

Do you have any advice for writing a test for this? This is what I have, but it's passing when I want it to fail based on the above: charlie/proj...charlie/dynamic
Does this work?

pyproject.toml:

    [build-system]
    requires = ["hatchling"]
    build-backend = "hatchling.build"

    [project]
    name = "hatch-dynamic"
    version = "1.0.0"
    dynamic = ["dependencies"]

    [tool.hatch.build.targets.wheel.hooks.custom]

hatch_build.py:

    from hatchling.builders.hooks.plugin.interface import BuildHookInterface

    class LiteraryBuildHook(BuildHookInterface):
        def initialize(self, version, build_data):
            build_data['dependencies'].append('foo')
Trying, thank you.

Yes, thanks.

It looks like the example use-case from the Hatch issue (pypa/hatch#532) doesn't use dynamic dependencies. I was hoping I could instead do something even more specific, like look for the presence of Hatch plugins (...

So I'm not really sure how to detect it :/
Hmm, good point - actually only metadata hooks have anything to do with ... If you want to check hooks, then you would have to search for the following: ...

Alternatively (or in addition), maybe it would be nice to indicate your usage with an environment variable, and then pip perhaps could start doing the same thing to standardize.

I'd find it worrying if a trend develops for frontends to special-case specific backends, or backends to special-case specific frontends.

Agree. I think we have to find a solution that is generic in both directions. Maybe something to discuss in Pittsburgh :)

Just a sanity check: does everyone here understand why I do this? https://github.com/pypa/hatch/blob/hatchling-v1.22.4/backend/src/hatchling/build.py#L86-L99 I'm open to feedback, but basically for extremely dynamic metadata you don't want frontends caching like that before the dynamic stuff happens, i.e. this issue.
I think I understand it, though I'm not familiar enough with Hatch to know why my first test case (using the ...) passed. I do think it's non-standards-compliant though, in two ways, at least based on what I saw in the linked issue: ...

I'm honestly somewhat hesitant to add custom workarounds for Hatch here, though it also seems like a huge shame to skip ... I know it's easier said than done, but I feel like the standards-compliant thing would be something like: ...
Metadata resolution is the very first step and that component doesn't take into account what kind of builds are happening so it works perfectly in that case.
That is unrelated to what's happening here but you're correct that I should produce an error for build hooks modifying static metadata just like I do for metadata hooks. I'll work on that soon.
I understand why you might think that, but the issue is that backends that offer dynamic functionality must make a trade-off. There are three options: ...
Is that part of the standard? It appears to me that you either implement it or you don't: https://peps.python.org/pep-0517/#prepare-metadata-for-build-wheel

Yeah, makes sense. This more spun out of my idea to detect ...

I think you're right. Our implementation allows the backend to return ...

This seems like a reasonable outcome. The problem is that the "conditionally" is based on detecting the build front-end. Like, this wouldn't just be a problem for ...

Yeah, I understand. But it's still out of compliance with the standard, right? I'm not sure what to suggest other than that the standard needs to evolve in some way, but I'm not familiar enough with build backends to understand how.

I suppose I'll use your implementation's hack and return ...

It's possible that breaks other frontends though, I'm not sure >.<

I have #2645 open for now.
My read is that this won't work for pip:

    def prepare_metadata_for_build_wheel(
            metadata_directory, config_settings, _allow_fallback):
        """Invoke optional prepare_metadata_for_build_wheel

        Implements a fallback by building a wheel if the hook isn't defined,
        unless _allow_fallback is False in which case HookMissing is raised.
        """
        backend = _build_backend()
        try:
            hook = backend.prepare_metadata_for_build_wheel
        except AttributeError:
            if not _allow_fallback:
                raise HookMissing()
        else:
            return hook(metadata_directory, config_settings)
        # fallback to build_wheel outside the try block to avoid exception chaining
        # which can be confusing to users and is not relevant
        whl_basename = backend.build_wheel(metadata_directory, config_settings)
        return _get_wheel_metadata_from_wheel(whl_basename, metadata_directory,
                                              config_settings)

(Though I know you omit pip anyway altogether.)
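By contrast, a frontend that wants to tolerate a backend returning `None` from the hook has to treat that as "no shortcut available, build the wheel instead". A rough Python illustration of that logic (uv itself is written in Rust, and `read_metadata_from_wheel` is a hypothetical helper):

```python
# Illustration only - not uv's actual implementation.
def resolve_metadata(backend, metadata_directory, config_settings=None):
    hook = getattr(backend, "prepare_metadata_for_build_wheel", None)
    if hook is not None:
        dist_info = hook(metadata_directory, config_settings)
        if dist_info:
            # The backend produced a .dist-info directory we can trust.
            return dist_info
    # Hook missing, or it signalled "can't promise accurate metadata" by
    # returning None: fall back to building the wheel and reading it from there.
    wheel_name = backend.build_wheel(metadata_directory, config_settings)
    return read_metadata_from_wheel(wheel_name, metadata_directory)  # hypothetical helper
```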
I would say just leave it for now, there's no need to make any urgent changes.

I ended up special-casing ...
…h dynamic dependencies (#2645)

## Summary

Hatch allows for highly dynamic customization of metadata via hooks. In such cases, Hatch can't uphold the PEP 517 contract, in that the metadata Hatch would return from `prepare_metadata_for_build_wheel` isn't guaranteed to match that of the built wheel. Hatch disables `prepare_metadata_for_build_wheel` entirely for pip. We'll instead disable it on our end when metadata is defined as "dynamic" in the pyproject.toml, which should allow us to leverage the hook in _most_ cases while still avoiding incorrect metadata for the remaining cases.

Closes: #2130.
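The gist of that gate, expressed as a hedged Python sketch (uv is written in Rust; this only illustrates the check against the standard PEP 621 `project.dynamic` field):

```python
# Sketch of the "skip prepare_metadata_for_build_wheel when dependencies are
# dynamic" idea - not uv's actual code.
import tomllib  # Python 3.11+


def can_trust_prepare_metadata(pyproject_path: str) -> bool:
    with open(pyproject_path, "rb") as f:
        project = tomllib.load(f).get("project", {})
    dynamic = set(project.get("dynamic", []))
    # If dependency metadata is declared dynamic, build the wheel instead of
    # trusting the metadata-preparation hook.
    return not ({"dependencies", "optional-dependencies"} & dynamic)
```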
When you install packages using a remote URL and specify extras, the `--editable` version of the extras is used, rather than the dependencies used in the wheel - while I don't think it's very well specified which dependencies should be used.

Here is the output of `uv pip install` (output collapsed).

Compare it with the equivalent `pip` result (output collapsed).

Note: all the `apache-airflow-providers-*` packages are missing in the case of `uv pip install`.

The problem is likely that the installation uses `pyproject.toml` directly to install dependencies. However, for such a remote installation (and without an `--editable` install at that - but even if it were specified, `--editable` makes no sense for a remote install), the dependencies should be the same as in the packaged .whl file, and this makes the installation by `uv` in this case non-compliant with PEP 517.

A bit more context: Airflow uses the hatchling build backend, and utilizes PEP 517-compliant build hooks (https://peps.python.org/pep-0517/#build-wheel) to modify the `--editable` extras into `wheel` extras on the fly. So, for example, the `[celery]` requirement in pyproject.toml (https://github.com/apache/airflow/blob/main/pyproject.toml#L641) is this (snippet collapsed). However, our hatchling build hook, when preparing the `wheel` package, replaces this extra with (snippet collapsed). This is the way we deal with our monorepo: the `--editable` "extra" just installs the dependencies of our providers, while the "wheel" extra installs the actual provider (and, transitively, the dependencies of that provider).

I believe that the PEP 517-compliant way of installing a package from a remote URL is to actually build the wheel file first, using the build backend the project has defined in `pyproject.toml`, and only then install that wheel file (this is exactly what `pip` does under the hood when installing a package from a remote URL - treating it the same way as installing an sdist package, which the remote URL is equivalent to).
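To make that suggested flow concrete, here is a small sketch of "build the wheel via the project's PEP 517 backend, then read the metadata out of the wheel", using the `pyproject_hooks` library. It assumes the build backend (hatchling, in Airflow's case) is already installed in the current environment, since no build isolation is set up here.

```python
# Sketch only: obtain metadata by building the wheel first, as pip does for sdists.
import tempfile
import zipfile
from pathlib import Path

from pyproject_hooks import BuildBackendHookCaller, default_subprocess_runner


def metadata_from_built_wheel(source_dir: str) -> str:
    # Assumes [build-system] in the project's pyproject.toml points at hatchling.
    hooks = BuildBackendHookCaller(
        source_dir, build_backend="hatchling.build", runner=default_subprocess_runner
    )
    with tempfile.TemporaryDirectory() as tmp:
        # build_wheel runs the backend's build hooks, so dynamic dependencies
        # injected via build_data end up in the wheel's METADATA.
        wheel_name = hooks.build_wheel(tmp)
        with zipfile.ZipFile(Path(tmp) / wheel_name) as wheel:
            metadata_path = next(
                name for name in wheel.namelist()
                if name.endswith(".dist-info/METADATA")
            )
            return wheel.read(metadata_path).decode()
```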