TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.
Expected Behavior
The model loads successfully.
Actual Behavior
Throws error:
TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.
at new LlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/node-llama-cpp/dist/evaluator/LlamaModel/LlamaModel.js:42:144)
at createLlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/utils/llama_cpp.js:13:12)
at new LlamaCpp (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/llms/llama_cpp.js:87:23)
at file:///C:/Users/User/Project/langchain-test/src/server.js:15:17
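The message itself means that the LlamaModel constructor destructures a `_llama` property from an options argument that is `undefined` at call time. A minimal, self-contained sketch of the same failure mode (an illustrative function, not the library's actual source):

```javascript
// Hypothetical stand-in for the LlamaModel constructor: it destructures
// `_llama` from its first argument, so calling it with no argument (as an
// incompatible wrapper would) throws exactly this TypeError.
function loadModelLike({ _llama }) {
  return _llama;
}

let caught;
try {
  loadModelLike(); // no options object passed
} catch (err) {
  caught = err;
}

console.log(caught instanceof TypeError); // true
console.log(caught.message); // "Cannot destructure property '_llama' of 'undefined' ..."
```

This suggests the wrapper is constructing `LlamaModel` without the argument the installed node-llama-cpp version expects, i.e. an API mismatch rather than a problem with the model file.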
Steps to reproduce
import { LlamaCpp } from "@langchain/community/llms/llama_cpp";
import fs from "fs";

// Note: this relative path is resolved against the process's working
// directory, not this file's location.
const llamaPath = "../project/data/llm-models/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf";
const question = "Where do Llamas come from?";

if (fs.existsSync(llamaPath)) {
  console.log(`Model found at ${llamaPath}`);
  const model = new LlamaCpp({ modelPath: llamaPath });
  console.log(`You: ${question}`);
  const response = await model.invoke(question);
  console.log(`AI : ${response}`);
} else {
  console.error(`Model not found at ${llamaPath}`);
}
error:
TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.
at new LlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/node-llama-cpp/dist/evaluator/LlamaModel/LlamaModel.js:42:144)
at createLlamaModel (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/utils/llama_cpp.js:13:12)
at new LlamaCpp (file:///C:/Users/User/Project/langchain-test/node_modules/@langchain/community/dist/llms/llama_cpp.js:87:23)
at file:///C:/Users/User/Project/langchain-test/src/server.js:15:17
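The stack trace shows @langchain/community's wrapper calling node-llama-cpp's `LlamaModel` constructor directly, while node-llama-cpp v3 expects models to be loaded through its `getLlama()` entry point. As a way to isolate the problem, here is a sketch of loading the same model with node-llama-cpp v3's own API (assuming v3.x is installed and following its documented getting-started flow), bypassing the LangChain wrapper entirely; if this works, the failure is in the wrapper's compatibility with v3:

```javascript
import path from "path";
import { fileURLToPath } from "url";
import { getLlama, LlamaChatSession } from "node-llama-cpp";

// Resolve the model path relative to this file rather than the working directory.
const __dirname = path.dirname(fileURLToPath(import.meta.url));
const modelPath = path.join(__dirname, "../project/data/llm-models/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf");

const llama = await getLlama();
const model = await llama.loadModel({ modelPath });
const context = await model.createContext();
const session = new LlamaChatSession({ contextSequence: context.getSequence() });

const response = await session.prompt("Where do Llamas come from?");
console.log(`AI : ${response}`);
```

(Requires the node-llama-cpp native bindings and the GGUF model file to be present, so it only runs in a fully set-up environment.)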
My Environment
command line: npx --yes node-llama-cpp inspect gpu
output:
'nlc' is not recognized as an internal or external command,
operable program or batch file.
Using:
Windows 10 (though I get this error in WSL as well)
Node: v22.9.0
"node-llama-cpp": "^3.1.1"
Additional Context
No response
Relevant Features Used
Metal support
CUDA support
Vulkan support
Grammar
Function calling
Are you willing to resolve this issue by submitting a Pull Request?
Yes, I have the time, and I know how to start.