
Error: loadModel() failed #1

@behelit

Hi,
I downloaded 'gemma-3-270-m-q4_k_m.gguf' from Hugging Face and attempted to bundle it via project assets. The app correctly picks it up and tries to load it, but fails to load the file.

The error reported is:
failed to load model from /data/user/0/com.jegly.offlineLLM/files/models/gemma-3-270-m-q4_k_m.gguf

The same error occurs when the model is loaded via the file import. It originates in llama_model_load_from_file_impl.

I can see log references in this C code, but I'm not sure where it logs to, as nothing is output via logcat.

So I guess I'd like to know:

  1. How could I view the LLAMA_LOG_ERROR outputs?
  2. What could cause the loadModel() to fail?
