System.Runtime.InteropServices.MarshalDirectiveException: 'Method's type signature is not PInvoke compatible.' #484
2024-Feb-03: Variations leading to the same error:
System:
I think that means one of the native llama.cpp methods that LLamaSharp calls is not quite compatible with the .NET Framework P/Invoke system. That can probably be worked around; if either of you can track down exactly which method this is, I can take a look at it. To track it down, just put a breakpoint on the LoadFromFile method and step into it; eventually you'll get to a method which crashes when called. That method is the one that'll need to be modified to work around this.
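A minimal sketch of that debugging approach (assuming the usual LLamaSharp namespaces and the `ModelParams`/`LLamaWeights.LoadFromFile` calls quoted later in this thread; the model path is hypothetical):

```csharp
using System;
using System.Runtime.InteropServices;
using LLama;
using LLama.Common;

class Program
{
    static void Main()
    {
        // Hypothetical path; substitute your own .gguf model file.
        var parameters = new ModelParams(@"D:\llms\model.gguf")
        {
            ContextSize = 1024
        };

        try
        {
            // Set a breakpoint here and step in (F11) with
            // "Just My Code" disabled; the frame that throws is
            // the P/Invoke signature that needs fixing.
            using var model = LLamaWeights.LoadFromFile(parameters);
            Console.WriteLine("Model loaded.");
        }
        catch (MarshalDirectiveException ex)
        {
            // The stack trace names the native method whose
            // signature the .NET Framework marshaller rejects.
            Console.WriteLine(ex);
        }
    }
}
```

Catching the exception rather than letting it unwind keeps the stack trace visible even when the debugger is not attached.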
Thank you for responding. Will do. Could it be that certain processors handle that method fine, while others don't? That could explain why only some users report this issue.
I think anyone running LLamaSharp on .NET Framework would have this issue, but .NET Framework is pretty rarely used these days.
I found the culprit, so it seems, by way of F11 Step Into. The way to crash and burn was: 1 of 2) in this method:
2 of 2) something happened in this line:
in this method (well, here's the whole litany):
So you think it's probably the
Yes, it looks like it, and I stop at this conclusion, since I do not know how to change anything myself.
Thanks for putting in the effort to narrow it down :)
No problem :) |
Hi there,
I've been trying with 0.10 in the hope that it works; alas, that is not the case. VS 2022 Community, version 17.9.1.
1 of 2) Tried with Target: The result is the same for all versions above. The code is simple, made even simpler with a single parameter, yet the problem is the same.
This hails to ModelParams.cs:
which in turn calls TensorSplitsCollection.cs:
which, FWIW, ends the .exe process with code -1073741795.
2 of 2) Tried with Target: The code:
-> First pass (the first parameter, I guess, "ContextSize"): this hails to ModelParams.cs:
which calls TensorSplitsCollection.cs:
-> OK -> goes to Nullable.cs, in:
-> OK -> exits the declaration above of var parameters = new ModelParams(modelPath) { ... }
which of course is in
and from here to:
where it crashes on the line: with the PInvoke error message in the title here. Looks like the exit point depends on the type of application and the version of .NET.
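To make the stepping sequence above concrete: a C# object initializer runs the constructor first, then each property setter in order, so everything traced through ModelParams.cs, TensorSplitsCollection.cs, and Nullable.cs happens on the managed side before any native call. A sketch (the property name is the one mentioned above; `modelPath` and the rest are assumptions):

```csharp
// 1. Constructor runs (ModelParams.cs), which also initializes the
//    TensorSplitsCollection property (TensorSplitsCollection.cs).
// 2. Each property setter then runs in order; a nullable property
//    like ContextSize involves Nullable<T> machinery (Nullable.cs).
var parameters = new ModelParams(modelPath)
{
    ContextSize = 1024
};

// 3. Only here does execution cross into native llama.cpp code via
//    P/Invoke -- which is where MarshalDirectiveException is raised.
using var model = LLamaWeights.LoadFromFile(parameters);
```

This is consistent with the observation that the crash point varies by application type and .NET version: the managed setup is the same, but the marshalling rules differ between runtimes.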
Any issue only gets attention because a motivated developer contributes their spare time and expertise to work on it! No one is being paid to work on LLamaSharp. As I understood it, this issue was about .NET Framework; however, from what you've said in your most recent message, it looks like it's also not working for you with .NET 6/7/8. Is that correct?
Yes indeed, it does not work with any of these: 2.0, 4.8, 6.0, 7.0, 8.0. The fact that others do not encounter this problem makes it even more frustrating; it stops at step 1, when loading a model, before anything else, yet not everywhere.
Are you absolutely certain that you correctly configured the .NET version? I've had a look through the docs and I can't see how this error could happen for you on .NET 8, but not for others.
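One way to double-check the configured target is the project file itself. A hypothetical SDK-style fragment (adjust to whichever project style is actually in use):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- net8.0 for modern .NET; net48 would target .NET Framework 4.8 -->
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
</Project>
```

If the project is an old non-SDK-style .NET Framework project, the equivalent setting is `<TargetFrameworkVersion>v4.8</TargetFrameworkVersion>` instead, and retargeting to .NET 8 requires migrating the project format.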
I think so, given that I've followed the instructions for installing the 0.10.0 packages of LLamaSharp and Backend.Cpu; anyway, I have no idea what else to do. (Interesting how in a Console App on .NET 8.0 it doesn't even get to the PInvoke error like it does in a WinForms App on lower .NETs.) Models tested on:
These files were placed in a directory (D:\llms\filename.gguf). The entire Console App / .NET 8.0 is the example below, taken from: The program starts, displays a log:
then it crashes in the Tensor area when entering the block of parameters.
OK now... It turns out that it was a question of RAM, if not of the processor; I suspect that the RAM is the reason:
So: I've tried the apps and the code above on these low-end systems:
The next one to use/test is an i5 with 12 GB RAM, but I presume it's clear: not enough RAM to load a model = app crash. That also explains why not many users experience this, since such low-end systems are indeed low-tech nowadays. I guess some technical writing about this simple situation is in order, like basic requirements; my apologies if such info already exists. I got mindlessly caught in this all-too-visible trap myself, and dragged you there too, so please accept my sincere apologies! I cannot give you back the time spent on this. Time is the fire in which we burn... So this should be closed and possibly documented somewhere (including within the code itself): something like a table of Model | RAM needed, and which error is caused by what, or a message along the lines of "Thine system doth not possess enough memory, so thou shalt not try it at thine abode/lair." Thank you for your time and interest in this issue.
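The "basic requirements" idea could be approximated with a pre-flight check before loading: compare the model file size against the memory the runtime believes is available. A sketch (assumes .NET Core 3.0+ for `GC.GetGCMemoryInfo`, so it won't help on .NET Framework; the 1.5x overhead factor is a guess, not a measured figure):

```csharp
using System;
using System.IO;

static class MemoryCheck
{
    /// <summary>
    /// Rough heuristic: a GGUF model needs at least its file size in
    /// RAM, plus overhead for context buffers. Returns false when the
    /// model very likely won't fit, so the caller can fail with a
    /// clear message instead of a hard native crash.
    /// </summary>
    public static bool LikelyFitsInMemory(string modelPath)
    {
        long modelBytes = new FileInfo(modelPath).Length;

        // Total memory available to the runtime (.NET Core 3.0+).
        long availableBytes = GC.GetGCMemoryInfo().TotalAvailableMemoryBytes;

        // Assumed overhead factor of 1.5x the file size.
        return modelBytes * 3 / 2 < availableBytes;
    }
}
```

A caller could then print "not enough memory for this model" and exit, rather than dying with exit code -1073741795 or a misleading marshalling error.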
Aha, good job narrowing that down!
I use LLamaSharp 0.9.1 and it does not run well; an error occurred.
var model = LLamaWeights.LoadFromFile(parameters); results in an error at runtime:
System.Runtime.InteropServices.MarshalDirectiveException: 'Method's type signature is not PInvoke compatible.'
I found issue #14, whose problem is like mine, but there was no solution. My environment is:
VS 2019
.NET Framework 4.8
LLamaSharp 0.9.1
LLamaSharp.Backend.Cpu 0.9.1