
node-llama-cpp not compatible with "@langchain/core": "0.3.13" #367

Closed
1 of 5 tasks
PeterTucker opened this issue Oct 19, 2024 · 1 comment
Labels
bug (Something isn't working), requires triage (Requires triaging)

Comments


PeterTucker commented Oct 19, 2024

Issue description

TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.

Expected Behavior

The model loads successfully.

Actual Behavior

Throws the following error:

TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.
    at new LlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/node-llama-cpp/dist/evaluator/LlamaModel/LlamaModel.js:42:144)
    at createLlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/utils/llama_cpp.js:13:12)
    at new LlamaCpp (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/llms/llama_cpp.js:87:23)
    at file:///C:/Users/User/Project/langchain-test/src/server.js:15:17
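The TypeError itself is generic JavaScript behavior: destructuring a property from `undefined` always throws this way, which suggests @langchain/community (written against the v2 API) constructs `LlamaModel` without the options object that v3's constructor destructures. A minimal sketch reproducing the error shape (`LlamaModelLike` is an illustrative stand-in, not the real class):

```javascript
// Destructuring a parameter from undefined throws the same TypeError seen above.
// LlamaModelLike is a hypothetical stand-in for node-llama-cpp v3's LlamaModel,
// whose constructor destructures an options object it expects to receive.
function LlamaModelLike({ _llama }) {
  return _llama;
}

try {
  LlamaModelLike(undefined); // simulates the wrapper passing nothing usable
} catch (e) {
  console.log(e instanceof TypeError); // true
  console.log(e.message);
}
```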

Steps to reproduce

    import { LlamaCpp } from "@langchain/community/llms/llama_cpp";
    import fs from "fs";

    const llamaPath = "../project/data/llm-models/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf";

    const question = "Where do Llamas come from?";

    if (fs.existsSync(llamaPath)) {
      console.log(`Model found at ${llamaPath}`);

      const model = new LlamaCpp({ modelPath: llamaPath });

      console.log(`You: ${question}`);
      const response = await model.invoke(question);
      console.log(`AI : ${response}`);
    } else {
      console.error(`Model not found at ${llamaPath}`);
    }

error:

TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.
    at new LlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/node-llama-cpp/dist/evaluator/LlamaModel/LlamaModel.js:42:144)
    at createLlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/utils/llama_cpp.js:13:12)
    at new LlamaCpp (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/llms/llama_cpp.js:87:23)
    at file:///C:/Users/User/Project/langchain-test/src/server.js:15:17
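Until the wrapper supports node-llama-cpp v3, one defensive option is to fail fast with a clear message when an incompatible major version is installed (e.g. by reading the version from the dependency's package.json). A minimal sketch; `parseMajor` and `assertCompatible` are illustrative helpers, not part of either library:

```javascript
// Extract the major version from a semver string or range like "3.1.1" or "^2.8.0".
function parseMajor(version) {
  const match = /\d+/.exec(version);
  return match ? Number(match[0]) : NaN;
}

// Hypothetical guard: given the installed node-llama-cpp version, refuse to
// continue when it is v3+, which the @langchain/community LlamaCpp wrapper
// does not support at the time of this issue.
function assertCompatible(installedVersion) {
  if (parseMajor(installedVersion) >= 3) {
    throw new Error(
      `node-llama-cpp ${installedVersion} is incompatible with @langchain/community's ` +
      `LlamaCpp wrapper; pin "node-llama-cpp": "^2" instead`
    );
  }
}

console.log(parseMajor("3.1.1")); // 3
assertCompatible("2.8.0"); // passes silently
```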

My Environment

command line: npx --yes node-llama-cpp inspect gpu
'nlc' is not recognized as an internal or external command,
operable program or batch file.

Using:
Windows 10. (though I get this error in WSL as well)
Node: v22.9.0
"node-llama-cpp": "^3.1.1"

Additional Context

No response

Relevant Features Used

  • Metal support
  • CUDA support
  • Vulkan support
  • Grammar
  • Function calling

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

PeterTucker added the bug (Something isn't working) and requires triage (Requires triaging) labels on Oct 19, 2024
PeterTucker (Author) commented:

Answer from a LangChain dev: "use version 2, not 3". In package.json:

    "dependencies": {
        "node-llama-cpp": "^2"
    },

