
Add recipe for libmetatensor-torch #26316

Merged 8 commits into conda-forge:main on May 17, 2024

Conversation

@Luthaf (Contributor) commented May 10, 2024

This is the continuation of conda-forge/metatensor-feedstock#2, now packaging the second native library. This one is pure C++, and links to libtorch.

Checklist

  • Title of this PR is meaningful: e.g. "Adding my_nifty_package", not "updated meta.yaml".
  • License file is packaged (see here for an example).
  • Source is from official source.
  • Package does not vendor other packages. (If a package uses the source of another package, they should be separate packages, or the licenses of all packages need to be packaged.)
  • If static libraries are linked in, the license of the static library is packaged.
  • Package does not ship static libraries. If static libraries are needed, follow CFEP-18.
  • Build number is 0.
  • A tarball (url) rather than a repo (e.g. git_url) is used in your recipe (see here for more details).
  • GitHub users listed in the maintainer section have posted a comment confirming they are willing to be listed there.
  • When in trouble, please check our knowledge base documentation before pinging a team.

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I wanted to let you know that I linted all conda-recipes in your PR (recipes/libmetatensor-torch) and found some lint.

Here's what I've got...

For recipes/libmetatensor-torch:

  • The following maintainers have not yet confirmed that they are willing to be listed here: PicoCentauri. Please ask them to comment on this PR if they are.

@Luthaf (Contributor Author) commented May 10, 2024

Just to check, the version of libtorch is pinned across the ecosystem, right? And any bump in the libtorch version will trigger a rebuild of all feedstocks depending on it?

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I wanted to let you know that I linted all conda-recipes in your PR (recipes/libmetatensor-torch, recipes/python-metatensor) and found some lint.

Here's what I've got...

For recipes/libmetatensor-torch:

  • The following maintainers have not yet confirmed that they are willing to be listed here: PicoCentauri. Please ask them to comment on this PR if they are.

For recipes/python-metatensor:

  • The following maintainers have not yet confirmed that they are willing to be listed here: PicoCentauri. Please ask them to comment on this PR if they are.

@Luthaf (Contributor Author) commented May 10, 2024

@conda-forge/help-c-cpp ready for review!

recipes/libmetatensor-torch/meta.yaml
url: {{ url_base }}/metatensor-torch-v{{ version }}/metatensor-torch-cxx-{{ version }}.tar.gz
sha256: 904cf858d8f98b67b948e8a453d8a6da56111e022050d6c8c3d32a9a2cc83464

build:
Member:

Shared libraries also need a run_exports.

@Luthaf (Contributor Author):

It's not very clear to me what run_exports does. Does it just add a run dependency to every package that builds against this library with it as a host dependency? Or is there something else I need to be aware of here?

Member:

https://docs.conda.io/projects/conda-build/en/latest/resources/define-metadata.html#export-runtime-requirements

run_exports export runtime requirements to downstream packages which dynamically link to your package. It's a statement which says "hey, if you link to my package, then this is the version that you will need at runtime."
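As a hedged illustration (the pin level here is an assumption, not copied from the actual feedstock), adding run_exports to a shared-library recipe looks roughly like this in meta.yaml:

```yaml
# Sketch only: the max_pin level is assumed, not taken from this PR.
build:
  number: 0
  run_exports:
    # Anything that lists libmetatensor in host: automatically gets
    # a compatible-version run dependency on it.
    - {{ pin_subpackage('libmetatensor', max_pin='x.x') }}
```

With this in place, downstream recipes no longer need to repeat libmetatensor in their run requirements.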

@Luthaf (Contributor Author):

Thanks! I've added run_exports to libmetatensor and removed it from the run requirements here.

I'd guess libtorch also has a run_exports, but is there a way to express "build against the cpu variant but run with any variant, cpu or cuda"? For now I do two separate host & run requirements.
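A minimal sketch of that host/run split (the build-string selector is an assumption; conda-forge's actual libtorch build strings may differ):

```yaml
requirements:
  host:
    # build against the CPU variant only, to keep CI builds light
    # (build-string pattern is assumed)
    - libtorch * cpu*
  run:
    # no variant constraint: any ABI-compatible libtorch variant,
    # cpu or cuda, can satisfy this at runtime
    - libtorch
```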

Member:

I'd guess libtorch also has a run_exports

Yes.

Looks like the run_exports for libtorch are variant agnostic, so if you build against any variant of libtorch you can get any variant at runtime. It looks like this package doesn't link against libtorch_cuda, so you don't need to worry about constraining the libtorch build variant.

@PicoCentauri (Contributor) commented

Thanks for the work @Luthaf. Of course I am willing to help maintain this recipe.

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/libmetatensor-torch) and found it was in an excellent condition.

@Luthaf (Contributor Author) commented May 16, 2024

@conda-forge/help-c-cpp ready for another round of review!

@Luthaf (Contributor Author) commented May 17, 2024

> The logs show that the build system is also looking for zlib? Is that a missing host dependency?

We are not trying to load ZLIB ourselves. Digging through the CMake files, this is what seems to happen:

We load torch with find_package(Torch), which in turn calls find_package(Caffe2):

https://github.com/pytorch/pytorch/blob/6c503f1dbbf9ef1bf99f19f0048c287f419df600/cmake/TorchConfig.cmake.in#L68

which then calls find_package(Protobuf):

https://github.com/pytorch/pytorch/blob/4ed93d6e0c5deb543ba5a3bd103728f00d39b1a6/cmake/Caffe2Config.cmake.in#L48
https://github.com/pytorch/pytorch/blob/6c503f1dbbf9ef1bf99f19f0048c287f419df600/cmake/public/protobuf.cmake#L4

which finally tries to find ZLIB:

https://github.com/protocolbuffers/protobuf/blob/2f3242c576504459621fd7d78bbccf39bfaa49c5/cmake/protobuf-config.cmake.in#L5
https://github.com/protocolbuffers/protobuf/blob/2f3242c576504459621fd7d78bbccf39bfaa49c5/CMakeLists.txt#L172
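If that transitive find_package(ZLIB) has to resolve during the build, one possible fix, sketched here as an assumption rather than what this PR actually did, is to add zlib to the host requirements:

```yaml
requirements:
  host:
    # satisfies protobuf's transitive find_package(ZLIB),
    # pulled in via Torch -> Caffe2 -> Protobuf
    - zlib
```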

@carterbox merged commit 341bf32 into conda-forge:main on May 17, 2024 (5 checks passed).
@Luthaf deleted the libmetatensor-torch branch on May 17, 2024.