
python3Packages.llama-cpp-python: 0.3.6 -> 0.3.8 #391734

Merged: 1 commit into master on Mar 22, 2025

Conversation

@kirillrdy (Member) commented Mar 21, 2025

diff: abetlen/llama-cpp-python@v0.3.6...v0.3.8

Things done

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
    • sandbox = relaxed
    • sandbox = true
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 25.05 Release Notes (or backporting 24.11 and 25.05 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

Add a 👍 reaction to pull requests you find important.

@booxter (Contributor) commented Mar 21, 2025

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 391734


aarch64-linux

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

x86_64-darwin

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

aarch64-darwin

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

extraPrefix = "vendor/llama.cpp/";
hash = "sha256-71+Lpg9z5KPlaQTX9D85KS2LXFWLQNJJ18TJyyq3/pU=";
})
];
Contributor:

Do we have to have an empty list for patches here?

kirillrdy (Member, Author):

removed

@@ -85,6 +78,7 @@ buildPythonPackage rec {
nativeBuildInputs = [
cmake
ninja
git
Contributor:

Instead of this, consider defining the variable that circumvents git revision extraction:

diff --git a/pkgs/development/python-modules/llama-cpp-python/default.nix b/pkgs/development/python-modules/llama-cpp-python/default.nix
index 115c533479f7..b7868efdcec0 100644
--- a/pkgs/development/python-modules/llama-cpp-python/default.nix
+++ b/pkgs/development/python-modules/llama-cpp-python/default.nix
@@ -8,7 +8,6 @@
 
   # nativeBuildInputs
   cmake,
-  git,
   ninja,
 
   # build-system
@@ -65,7 +64,7 @@ buildPythonPackage rec {
     # -mcpu, breaking linux build as follows:
     #
     # cc1: error: unknown value ‘native+nodotprod+noi8mm+nosve’ for ‘-mcpu’
-    [ "-DGGML_NATIVE=off" ]
+    [ "-DGGML_NATIVE=off" "-DGGML_BUILD_NUMBER=1" ]
     ++ lib.optionals cudaSupport [
       "-DGGML_CUDA=on"
       "-DCUDAToolkit_ROOT=${lib.getDev cudaPackages.cuda_nvcc}"
@@ -78,7 +77,6 @@ buildPythonPackage rec {
   nativeBuildInputs = [
     cmake
     ninja
-    git
   ];
 
   build-system = [

The final cmake version file suggests that the build system fails to extract the revision from git anyway, since we don't check out submodules deeply. See:

$ grep PACKAGE_VERSION result/lib/python3.12/site-packages/lib/cmake/ggml/ggml-version.cmake | grep '^set'
set(PACKAGE_VERSION "0.0.")

If you think we should have the version populated properly, then we should do:

diff --git a/pkgs/development/python-modules/llama-cpp-python/default.nix b/pkgs/development/python-modules/llama-cpp-python/default.nix
index 115c533479f7..03f136e8f0e7 100644
--- a/pkgs/development/python-modules/llama-cpp-python/default.nix
+++ b/pkgs/development/python-modules/llama-cpp-python/default.nix
@@ -48,8 +48,9 @@ buildPythonPackage rec {
     owner = "abetlen";
     repo = "llama-cpp-python";
     tag = "v${version}";
-    hash = "sha256-F1E1c2S1iIL3HX/Sot/uIIrOWvfPU1dCrHx14A1Jn9E=";
+    hash = "sha256-KLlm0rHOqEB90XXxJj+tunO6Nxyy/WUoTbhnkfmalAw=";
     fetchSubmodules = true;
+    deepClone = true;
   };
   # src = /home/gaetan/llama-cpp-python;
 

With that, the version is correctly populated:

$ grep PACKAGE_VERSION result/lib/python3.12/site-packages/lib/cmake/ggml/ggml-version.cmake | grep '^set'
set(PACKAGE_VERSION "0.0.4875")

kirillrdy (Member, Author):

done

[ "-DGGML_NATIVE=off" ]
[
"-DGGML_NATIVE=off"
"-DGGML_BUILD_NUMBER=1"
Contributor:

If you deepClone then the version will actually be extracted from the (submodule) git revision; and then you don't need to override it with the variable here. These two are mutually exclusive.
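For clarity, the two mutually exclusive options discussed in this thread can be sketched roughly as follows (a simplified fragment of the derivation, not the full expression; the real file builds `cmakeFlags` with `lib.optionals` for the CUDA and Metal cases shown in the diffs above):

```nix
# Option 1: pin GGML's build number at configure time, so CMake never
# attempts to extract a revision from git at all.
cmakeFlags = [
  "-DGGML_NATIVE=off"
  "-DGGML_BUILD_NUMBER=1"
];

# Option 2 (alternative, not combined with option 1): keep git metadata
# so CMake can read the real revision from the llama.cpp submodule.
# src = fetchFromGitHub {
#   owner = "abetlen";
#   repo = "llama-cpp-python";
#   tag = "v${version}";
#   fetchSubmodules = true;
#   deepClone = true;
#   hash = "sha256-KLlm0rHOqEB90XXxJj+tunO6Nxyy/WUoTbhnkfmalAw=";
# };
```

With option 1 the embedded version stays at a fixed placeholder; with option 2 it is populated from the submodule's git history, which is why applying both at once is redundant.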

@kirillrdy force-pushed the llama-cpp-python branch 2 times, most recently from 1432495 to 1507a8d on March 21, 2025 20:33
@booxter (Contributor) commented Mar 21, 2025

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 391734


aarch64-linux

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

x86_64-darwin

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

aarch64-darwin

✅ 8 packages built:
  • python312Packages.llama-cpp-python
  • python312Packages.llama-cpp-python.dist
  • python312Packages.llm-gguf
  • python312Packages.llm-gguf.dist
  • python313Packages.llama-cpp-python
  • python313Packages.llama-cpp-python.dist
  • python313Packages.llm-gguf
  • python313Packages.llm-gguf.dist

@booxter (Contributor) left a review:

Thanks.

@wegank wegank added 12.approved-by: package-maintainer This PR was reviewed and approved by a maintainer listed in the package 12.approvals: 1 This PR was reviewed and approved by one reputable person labels Mar 21, 2025
@booxter (Contributor) left a review:

I tried to run the same on a different machine, and I'm afraid deepClone won't work: there's a bug in its implementation that makes it nondeterministic (#100498) :(

We will probably have to go with overriding the variable until this is fixed in the fetcher.

@kirillrdy merged commit 2e384a0 into master on Mar 22, 2025 (26 of 27 checks passed)
@kirillrdy deleted the llama-cpp-python branch on March 22, 2025 03:53
@booxter (Contributor) commented Mar 22, 2025

When abetlen/llama-cpp-python#1979 is fixed, we may be able to remove the fake version setting and rely on the version file included in the package tarball.

Labels: 6.topic: python · 10.rebuild-darwin: 1-10 · 10.rebuild-linux: 1-10 · 12.approvals: 1 · 12.approved-by: package-maintainer