
[02]: Solver report #7

Open
SomeoneSerge opened this issue Jul 3, 2024 · 3 comments
Comments

@SomeoneSerge
Collaborator

The return value of the solver part of evanix should contain enough information to generate a report saying:

  • (0) how many derivations will be built in total (in future versions, also how many compute hours we expect to expend),
  • (1) which "targets" will be built,
  • (2) which "targets" will remain unsatisfied.

Optional feature: print how much extra budget is required to satisfy one more target. Both --dry-run and the normal mode should be able to produce a report like this. The return value of the solver should probably be an immutable structure, and the "solver" should be a simple enough function to be used in a unit test.
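
For illustration only (the names solver_report, solver_solve and struct job_graph are hypothetical, not evanix's actual API), an immutable report could be little more than counts plus arrays of target names, returned by a pure function that is easy to call from a unit test:

#include <stddef.h>

struct solver_report {
        size_t n_builds;          /* (0) total derivations to be built        */
        const char **satisfied;   /* (1) targets that will be built           */
        size_t n_satisfied;
        const char **unsatisfied; /* (2) targets that will remain unsatisfied */
        size_t n_unsatisfied;
};

/* Hypothetical signature: reads the (const) job graph and a budget,
 * allocates a fresh report, and mutates neither input. */
struct solver_report *solver_solve(const struct job_graph *graph,
                                   unsigned max_builds);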

SomeoneSerge mentioned this issue Jul 8, 2024
@sinanmohd
Owner

3529fff

@SomeoneSerge
Collaborator Author

❯ time nix run github:sinanmohd/evanix -- --system x86_64-linux --flake .#packages --dry-run --max-build 2 --solver-report
ℹ️ cost:  1, conformity: 0.00 <-> llama-cpp-cuda-0.0.0 -> /nix/store/8zfrys9ph8lvsk2zdvkflm7x8j5l7mky-llama-cpp-cuda-0.0.0.drv
✅  cost:  1, conformity: 0.00 <-> llama-cpp-cuda-0.0.0 -> /nix/store/8zfrys9ph8lvsk2zdvkflm7x8j5l7mky-llama-cpp-cuda-0.0.0.drv
🛠️ nix-build --out-link result-x86_64-linux.cuda /nix/store/8zfrys9ph8lvsk2zdvkflm7x8j5l7mky-llama-cpp-cuda-0.0.0.drv
ℹ️ cost:  1, conformity: 0.00 <-> llama-cpp-blas-0.0.0 -> /nix/store/di2nvkzqfkkqmi4jbx964cdmcp1406mm-llama-cpp-blas-0.0.0.drv
✅ cost:  1, conformity: 0.00 <-> llama-cpp-blas-0.0.0 -> /nix/store/di2nvkzqfkkqmi4jbx964cdmcp1406mm-llama-cpp-blas-0.0.0.drv
🛠️ nix-build --out-link result-x86_64-linux.default /nix/store/di2nvkzqfkkqmi4jbx964cdmcp1406mm-llama-cpp-blas-0.0.0.drv
❌ cost:  1 >  0 <-> llama-cpp-blas-mpi-0.0.0 -> /nix/store/04gcqsa1z0d22nv9irk2chliqmwykhb5-llama-cpp-blas-mpi-0.0.0.drv
❌ cost:  1 >  0 <-> llama-cpp-rocm-0.0.0 -> /nix/store/g7wl7g3sh809wk8kkx28j3bp5nks83in-llama-cpp-rocm-0.0.0.drv
❌ cost:  1 >  0 <-> llama-cpp-vulkan-0.0.0 -> /nix/store/rfxvn6zksgq1mkvm756hy6xxsiz7lsm0-llama-cpp-vulkan-0.0.0.drv
❌ cost:  3 >  0 <-> llama-cpp-x86_64-w64-mingw32-0.0.0 -> /nix/store/cjzqlx0jk97lqmnvsjxymfgjxbaxzg67-llama-cpp-x86_64-w64-mingw32-0.0.0.drv

Awesome.

Nits:

  • Not sure why the same llama-cpp-cuda is printed 4 times
  • No idea what "conformity" is
  • Not sure 3 > 0 is useful
  • Not sure what's the difference between ℹ️ and ✅
  • AFAIU you kept adding solver state into the same mutable struct jobs, e.g.:
    bool reported;
    I'll just remind you that if you wanted to keep the solver data separate and the DAG immutable, you absolutely could, e.g. (but not limited to) by numbering the nodes and referring to them by their ids rather than by pointers (see the sketch after this list)
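
A minimal sketch of that idea (the type names here are made up for illustration, not evanix's actual structs): DAG nodes carry stable ids and reference their dependencies by id, while the solver keeps its own bookkeeping in a separate array indexed by those ids, so the graph itself never needs to be mutated.

#include <stdbool.h>
#include <stddef.h>
#include <stdlib.h>

/* Immutable DAG node: edges are expressed as node ids, not pointers. */
struct dag_node {
        size_t id;          /* index of this node in the graph's node array */
        const char *drv;    /* /nix/store/...drv path                       */
        const size_t *deps; /* dependency node ids                          */
        size_t n_deps;
};

/* Solver-private state, kept out of the DAG and indexed by node id. */
struct solver_state {
        bool *reported;     /* reported[id] instead of a field in the node  */
        size_t n_nodes;
};

static struct solver_state solver_state_new(size_t n_nodes)
{
        return (struct solver_state){
                .reported = calloc(n_nodes, sizeof(bool)),
                .n_nodes = n_nodes,
        };
}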

@sinanmohd
Owner

I've removed ✅ and the attr name duplication in commit 456460c.
