python3Packages.llama-cpp-python: 0.3.6 -> 0.3.8 #391734
Conversation
      extraPrefix = "vendor/llama.cpp/";
      hash = "sha256-71+Lpg9z5KPlaQTX9D85KS2LXFWLQNJJ18TJyyq3/pU=";
    })
  ];
Do we have to have an empty list for `patches` here?
removed
@@ -85,6 +78,7 @@ buildPythonPackage rec {
   nativeBuildInputs = [
     cmake
     ninja
+    git
Instead of this, consider defining the variable that circumvents git revision extraction:
diff --git a/pkgs/development/python-modules/llama-cpp-python/default.nix b/pkgs/development/python-modules/llama-cpp-python/default.nix
index 115c533479f7..b7868efdcec0 100644
--- a/pkgs/development/python-modules/llama-cpp-python/default.nix
+++ b/pkgs/development/python-modules/llama-cpp-python/default.nix
@@ -8,7 +8,6 @@
# nativeBuildInputs
cmake,
- git,
ninja,
# build-system
@@ -65,7 +64,7 @@ buildPythonPackage rec {
# -mcpu, breaking linux build as follows:
#
# cc1: error: unknown value ‘native+nodotprod+noi8mm+nosve’ for ‘-mcpu’
- [ "-DGGML_NATIVE=off" ]
+ [ "-DGGML_NATIVE=off" "-DGGML_BUILD_NUMBER=1" ]
++ lib.optionals cudaSupport [
"-DGGML_CUDA=on"
"-DCUDAToolkit_ROOT=${lib.getDev cudaPackages.cuda_nvcc}"
@@ -78,7 +77,6 @@ buildPythonPackage rec {
nativeBuildInputs = [
cmake
ninja
- git
];
build-system = [
The final CMake version file suggests that the build system fails to extract the revision from git anyway, since we don't check out submodules deeply. See:
$ grep PACKAGE_VERSION result/lib/python3.12/site-packages/lib/cmake/ggml/ggml-version.cmake | grep '^set'
set(PACKAGE_VERSION "0.0.")
If you think we should have the version populated properly, then we should do:
diff --git a/pkgs/development/python-modules/llama-cpp-python/default.nix b/pkgs/development/python-modules/llama-cpp-python/default.nix
index 115c533479f7..03f136e8f0e7 100644
--- a/pkgs/development/python-modules/llama-cpp-python/default.nix
+++ b/pkgs/development/python-modules/llama-cpp-python/default.nix
@@ -48,8 +48,9 @@ buildPythonPackage rec {
owner = "abetlen";
repo = "llama-cpp-python";
tag = "v${version}";
- hash = "sha256-F1E1c2S1iIL3HX/Sot/uIIrOWvfPU1dCrHx14A1Jn9E=";
+ hash = "sha256-KLlm0rHOqEB90XXxJj+tunO6Nxyy/WUoTbhnkfmalAw=";
fetchSubmodules = true;
+ deepClone = true;
};
# src = /home/gaetan/llama-cpp-python;
With that, the version is correctly populated:
$ grep PACKAGE_VERSION result/lib/python3.12/site-packages/lib/cmake/ggml/ggml-version.cmake | grep '^set'
set(PACKAGE_VERSION "0.0.4875")
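For context, the trailing `.4875` presumably comes from counting commits in the submodule's git history; that is an assumption about ggml's versioning scheme, not something confirmed in this thread. A minimal sketch of that mechanism on a throwaway repo (assumes `git` is on `PATH`):

```shell
# Sketch: derive a build number by counting commits, as the build
# presumably does for the "0.0.NNNN" version (assumption, not verified).
set -eu
dir=$(mktemp -d)
git -C "$dir" init -q
git -C "$dir" -c user.email=t@t -c user.name=t commit -q --allow-empty -m first
git -C "$dir" -c user.email=t@t -c user.name=t commit -q --allow-empty -m second
git -C "$dir" rev-list --count HEAD   # prints the commit count: 2
```

With a real deep clone of the submodule, the same `rev-list --count` would yield the large build number seen above; without git history, any such extraction can only fall back to a placeholder.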
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
done
(force-pushed from b381e78 to c518609)
-  [ "-DGGML_NATIVE=off" ]
+  [
+    "-DGGML_NATIVE=off"
+    "-DGGML_BUILD_NUMBER=1"
If you `deepClone`, then the version will actually be extracted from the (submodule) git revision, and then you don't need to override it with the variable here. These two are mutually exclusive.
(force-pushed from 1432495 to 1507a8d)
Thanks.
I tried to run the same on a different machine and I'm afraid that `deepClone` won't work, because there's a bug in its implementation that makes it non-deterministic: #100498 :(
We will probably have to go with overriding the variable until this is fixed in the fetcher.
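To illustrate why that matters (a simplified sketch, not the actual fetcher logic): a fixed-output fetch pins a single hash over the whole checkout, and `deepClone` keeps the `.git` directory, whose byte-level contents can differ between otherwise identical clones:

```shell
# Two checkouts with identical sources but differing .git internals
# (standing in for pack layout/metadata differences between clones).
set -eu
a=$(mktemp -d); b=$(mktemp -d)
mkdir "$a/.git" "$b/.git"
echo same > "$a/src";          echo same > "$b/src"           # identical worktrees
echo pack-v1 > "$a/.git/meta"; echo pack-v2 > "$b/.git/meta"  # .git differs
# Hash each tree the way a fixed-output fetcher conceptually would:
tree_hash() { (cd "$1" && find . -type f | sort | xargs sha256sum | sha256sum | cut -d' ' -f1); }
[ "$(tree_hash "$a")" != "$(tree_hash "$b")" ] && echo "pinned hash would mismatch"
```

Because the pinned hash covers `.git` too, any such variation makes the fetch fail hash verification on some machines and not others.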
(force-pushed from 1507a8d to 7e69ba3)
When abetlen/llama-cpp-python#1979 is fixed, we may be able to remove the fake version setting and rely on the version file included in the package tarball.
diff: abetlen/llama-cpp-python@v0.3.6...v0.3.8
Things done
- Is sandboxing enabled in nix.conf? (See Nix manual)
  - sandbox = relaxed
  - sandbox = true
- Tested compilation with `nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD"`. Note: all changes have to be committed; also see nixpkgs-review usage.
- Tested basic functionality of all binary files (usually in `./result/bin/`)

Add a 👍 reaction to pull requests you find important.