Hello,
First of all, thank you for providing such a useful tool.
I am using LLamaSharp in a Unity project and get stuck at the very first call into the library at run-time (both in the editor and in a build). The failing line is:
ModelParams modelParams = new ModelParams(modelPath);
RuntimeError: The native library cannot be correctly loaded. It could be one of the following reasons:
1. No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them.
2. You are using a device with only CPU but installed cuda backend. Please install cpu backend instead.
3. One of the dependency of the native library is missed. Please use `ldd` on linux, `dumpbin` on windows and `otool` to check if all the dependency of the native library is satisfied. Generally you could find the libraries under your output folder.
4. Try to compile llama.cpp yourself to generate a libllama library, then use `LLama.Native.NativeLibraryConfig.WithLibrary` to specify it at the very beginning of your code. For more information about compilation, please refer to LLamaSharp repo on github.
LLama.Native.NativeApi..cctor () (at <37add02f2b434c93b00ec36010865f62>:0)
Rethrow as TypeInitializationException: The type initializer for 'LLama.Native.NativeApi' threw an exception.
LLama.Abstractions.TensorSplitsCollection..ctor () (at <37add02f2b434c93b00ec36010865f62>:0)
LLama.Common.ModelParams..ctor (System.String modelPath) (at <37add02f2b434c93b00ec36010865f62>:0)
AIChatBot.Test () (at Assets/0 - Scripts/AI/AIChatBot.cs:22)
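Suggestion 4 in the error message mentions `LLama.Native.NativeLibraryConfig.WithLibrary`. Below is a minimal sketch of how I understand that call would be wired up before any other LLamaSharp usage; the static entry point used here and the library path are my assumptions, not something I have verified against v0.25.0.

```csharp
// Minimal sketch, assuming LLamaSharp 0.25.0 and a self-compiled libllama;
// the static entry point and the library path below are assumptions.
using LLama.Common;
using LLama.Native;
using UnityEngine;

public class LlamaBootstrap : MonoBehaviour
{
    // Hypothetical path to a manually built libllama for osx-arm64.
    private const string LibLlamaPath = "/path/to/libllama.dylib";

    private void Awake()
    {
        // Must run before NativeApi's static constructor is triggered, i.e.
        // before the first ModelParams / model-load call anywhere in the project.
        // NOTE: the exact entry point (.LLama / .All / .Instance) and the
        // WithLibrary signature differ between LLamaSharp versions.
        NativeLibraryConfig.LLama.WithLibrary(LibLlamaPath);
    }

    private void Start()
    {
        // The call that currently throws the TypeInitializationException.
        var modelParams = new ModelParams("/path/to/model.gguf");
        Debug.Log($"ModelParams created for {modelParams.ModelPath}");
    }
}
```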
Environment:
- macOS 15.6, Apple Silicon (M1)
- Unity 6.0.58
- .NET Standard 2.1
- LLamaSharp, LLamaSharp.Backend.Cpu, and Microsoft.KernelMemory.Kernel installed through NuGetForUnity
- All dependencies are included in the packages folder; the backend native runtime in use is osx-arm64 (see the check sketched below)
- LLamaSharp v0.25.0
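To double-check that the osx-arm64 backend library is actually reachable at run-time (suggestion 3 in the error message says the libraries should be under the output folder), this is roughly the check I have in mind. The folder layout under Assets/Packages is an assumption about how NuGetForUnity unpacks the backend, not a confirmed path.

```csharp
// Sketch only: verifies that the CPU backend's native library exists at an
// assumed location; the sub-path below is a guess, not a verified layout.
using System.IO;
using UnityEngine;

public class NativeLibCheck : MonoBehaviour
{
    private void Awake()
    {
        // NuGetForUnity normally restores packages under Assets/Packages,
        // but the exact package folder name and sub-path are assumptions.
        string candidate = Path.Combine(
            Application.dataPath,
            "Packages/LLamaSharp.Backend.Cpu.0.25.0/runtimes/osx-arm64/native/libllama.dylib");

        Debug.Log(File.Exists(candidate)
            ? $"libllama found at {candidate}"
            : $"libllama NOT found at {candidate}");
    }
}
```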
Not sure how to move forward...