Failed to set up llama.cpp #546

Open
@Mubelotix

Description

Specs

  • Leon version: 1.0.0-beta.10+dev
  • OS (or browser) version: Ubuntu 24.10
  • Node.js version: v18.19.1
  • Complete "leon check" (or "npm run check") output:
```

leon@1.0.0-beta.10+dev check
tsx scripts/check.js

.: CHECKING :.
ℹ️ Leon version
✅ 1.0.0-beta.10+dev

ℹ️ Environment
✅ Free RAM: 10.09 GB | Total RAM: 15.5 GB
✅ {"type":"Linux","platform":"linux","arch":"x64","cpus":8,"release":"6.8.0-51-generic","osName":"Linux 6.8","distro":{"os":"linux","dist":"Ubuntu","codename":"noble","release":"24.04"}}

ℹ️ node --version
✅ v18.19.1

ℹ️ npm --version
✅ 9.2.0

✅ Node.js bridge version: 1.2.0
ℹ️ Executing a skill...
ℹ️ /tmp/leon/node_modules/tsx/dist/cli.mjs /tmp/leon/bridges/nodejs/dist/bin/leon-nodejs-bridge.js "/tmp/leon/scripts/assets/nodejs-bridge-intent-object.json"
🚨 Error: Command failed with exit code 1: /tmp/leon/node_modules/tsx/dist/cli.mjs /tmp/leon/bridges/nodejs/dist/bin/leon-nodejs-bridge.js "/tmp/leon/scripts/assets/nodejs-bridge-intent-object.json"

node:internal/process/esm_loader:40
internalBinding('errors').triggerUncaughtException(
^
Error [ERR_MODULE_NOT_FOUND]: Cannot find module '/tmp/leon/bridges/nodejs/dist/bin/leon-nodejs-bridge.js' imported from /tmp/leon/
at new NodeError (node:internal/errors:405:5)
at finalizeResolution (node:internal/modules/esm/resolve:327:11)
at moduleResolve (node:internal/modules/esm/resolve:980:10)
at defaultResolve (node:internal/modules/esm/resolve:1193:11)
at nextResolve (node:internal/modules/esm/hooks:864:28)
at y (file:///tmp/leon/node_modules/tsx/dist/esm/index.mjs?1735752550656:2:2067)
at j (file:///tmp/leon/node_modules/tsx/dist/esm/index.mjs?1735752550656:2:3314)
at async nextResolve (node:internal/modules/esm/hooks:864:22)
at async Hooks.resolve (node:internal/modules/esm/hooks:302:24)
at async handleMessage (node:internal/modules/esm/worker:196:18) {
url: 'file:///tmp/leon/bridges/nodejs/dist/bin/leon-nodejs-bridge.js',
code: 'ERR_MODULE_NOT_FOUND'
}

Node.js v18.19.1

✅ Python bridge version: 1.3.0
ℹ️ Executing a skill...
ℹ️ /tmp/leon/bridges/python/dist/linux-x86_64/leon-python-bridge "/tmp/leon/scripts/assets/python-bridge-intent-object.json"
🚨 Error: Command failed with exit code 127: /tmp/leon/bridges/python/dist/linux-x86_64/leon-python-bridge "/tmp/leon/scripts/assets/python-bridge-intent-object.json"
/bin/sh: 1: /tmp/leon/bridges/python/dist/linux-x86_64/leon-python-bridge: not found

✅ Python TCP server version: 1.1.0
ℹ️ Starting the Python TCP server...
ℹ️ /tmp/leon/tcp_server/dist/linux-x86_64/leon-tcp-server en
ℹ️ Python TCP server startup time: 8ms

ℹ️ Global resolvers NLP model state
🚨 Global resolvers NLP model not found or broken. Try to generate a new one: "npm run train"

ℹ️ Skills resolvers NLP model state
🚨 Skills resolvers NLP model not found or broken. Try to generate a new one: "npm run train"

ℹ️ Main NLP model state
🚨 Main NLP model not found or broken. Try to generate a new one: "npm run train"

ℹ️ Amazon Polly TTS
🚨 Cannot start the Python TCP server: /bin/sh: 1: /tmp/leon/tcp_server/dist/linux-x86_64/leon-tcp-server: not found

⚠️ Amazon Polly TTS is not yet configured

ℹ️ Google Cloud TTS/STT
⚠️ Google Cloud TTS/STT is not yet configured

ℹ️ Watson TTS
⚠️ Watson TTS is not yet configured

ℹ️ Offline TTS
⚠️ Cannot find bin/flite/flite. You can set up the offline TTS by running: "npm run setup:offline-tts"

ℹ️ Watson STT
⚠️ Watson STT is not yet configured

ℹ️ Offline STT
⚠️ Cannot find bin/coqui/huge-vocabulary.scorer. You can setup the offline STT by running: "npm run setup:offline-stt"

.: REPORT :.
ℹ️ Here is the diagnosis about your current setup
✅ Run
🚨 Run skills
🚨 Reply you by texting
🚨 Start the Python TCP server
⚠️ Amazon Polly text-to-speech
⚠️ Google Cloud text-to-speech
⚠️ Watson text-to-speech
⚠️ Offline text-to-speech
⚠️ Google Cloud speech-to-text
⚠️ Watson speech-to-text
⚠️ Offline speech-to-text

🚨 Please fix the errors above

.: REPORT URL :.
ℹ️ Sending report...
✅ Report URL: https://report.getleon.ai/raw/ijayemazaj
```

Expected Behavior

`leon create birth` should complete successfully.

Actual Behavior

It fails with the following error:

```
ℹ️ Updating llama.cpp to b3543...
Usage: node-llama-cpp <command> [options]
https://node-llama-cpp.withcat.ai/cli

Commands:
  node-llama-cpp pull [urls..]         Download models from URLs
                                       https://node-llama-cpp.withcat.ai/cli/pull                                   [aliases: get]
  node-llama-cpp chat [modelPath]      Chat with a model
                                       https://node-llama-cpp.withcat.ai/cli/chat
  node-llama-cpp init [name]           Generate a new `node-llama-cpp` project from a template
                                       https://node-llama-cpp.withcat.ai/cli/init
  node-llama-cpp source <command>      Manage `llama.cpp` source code
                                       https://node-llama-cpp.withcat.ai/cli/source
  node-llama-cpp complete [modelPath]  Generate a completion for a given text
                                       https://node-llama-cpp.withcat.ai/cli/complete
  node-llama-cpp infill [modelPath]    Generate an infill completion for a given suffix and prefix texts
                                       https://node-llama-cpp.withcat.ai/cli/infill
  node-llama-cpp inspect <command>     Inspect the inner workings of `node-llama-cpp`
                                       https://node-llama-cpp.withcat.ai/cli/inspect

Options:
  -h, --help     Show help                                                                                               [boolean]
  -v, --version  Show version number                                                                                     [boolean]

Unknown command: download
🚨 Failed to set up llama.cpp: Error: Command failed with exit code 1: npx --no node-llama-cpp download --release "b3543"
Error: Command failed with exit code 1: npm run postinstall
```

My understanding is that `node-llama-cpp` may have introduced a breaking change, renaming its `download` command to `pull`. Leon's setup script still calls the old command, which the CLI now rejects with `Unknown command: download` (see the sketch below).
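For reference, here is the failing invocation from the log alongside what an updated one might look like; the `source download` subcommand is an assumption based on the `source <command>` group listed in the help output above (untested):

```
# What Leon's postinstall currently runs (fails with "Unknown command: download"):
npx --no node-llama-cpp download --release "b3543"

# Possible replacement, assuming the subcommand moved under the `source` group:
npx --no node-llama-cpp source download --release "b3543"
```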

How Do We Reproduce?

```
sudo npm install --global @leon-ai/cli
leon create birth
```
