
Issue After Installing AI Model #37

Open
plyght opened this issue Feb 25, 2024 · 5 comments

Comments


plyght commented Feb 25, 2024

I got this message after it said "initializing AI model": " Error: Unable to load dynamic library: Unable to load dynamic server library: dlopen(/var/folders/r3/41cmfh2s0l52ndz3w971w7lc0000gn/T/ollama2523156174/metal/libext_server.dylib, 0x0006): tried: '/var/folders/r3/41cmfh2s0l52ndz3w971w7lc0000gn/T/ollama2523156174/metal/libext_server.dylib' (code signature in <5A0BFDE3-DAFF-3CDC-88E1-EB69C3900B6B> '/private/var/folders/r3/41cmfh2s0l52ndz3w971w7lc0000gn/T/ollama2523156174/metal/libext_server.dylib' not valid for use in process: mapping process and mapped file (non-platform) have different Team IDs), "

[screenshot attached]
@BruceMacD (Owner)

Hi @plyght, sorry about this. Are you also running Ollama locally? I think this could be related to a bug in Ollama where it doesn't clean up its files from the temp directory.
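Until that's fixed upstream, the stale temp files can be cleared by hand. A minimal sketch, assuming the macOS per-user temp layout shown in the error (Ollama unpacks its runner libraries into directories named ollama<digits> under $TMPDIR, which on macOS resolves to a path beneath /var/folders/):

```shell
# Remove stale Ollama runner directories from the per-user temp dir.
TMP="${TMPDIR:-/tmp}"
for d in "$TMP"/ollama*; do
  if [ -d "$d" ]; then
    rm -rf "$d"
    echo "removed $d"
  fi
done
```

Relaunching the app afterwards forces it to unpack a fresh, correctly signed copy of libext_server.dylib.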


ag2307 commented Mar 15, 2024

Hi, I wanted to try chatd and got the same error on the first run. I am not running any other LLM application locally. What could be causing this?

Here's the service log (if required):

info: Ollama is not running: TypeError: fetch failed
info: Ollama is not installed on the system: TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received undefined
info: waiting for ollama server...
info: fetch failed {"cause":{"address":"127.0.0.1","code":"ECONNREFUSED","errno":-61,"port":11434,"syscall":"connect"}}
info: ollama server is running
info: pulling model: mistral
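For anyone comparing their own logs, the startup probes above can be reproduced by hand; a sketch assuming Ollama's default endpoint on port 11434 (a GET to the root path answers once the server is up):

```shell
# Probe the local Ollama server the way the service log above does.
if curl -sf --max-time 2 http://127.0.0.1:11434/ >/dev/null 2>&1; then
  STATUS="ollama server is running"
else
  STATUS="Ollama is not running (connection refused or timed out)"
fi
echo "$STATUS"
```

If this reports the server as down while chatd claims it is running, the two are likely racing on the same port.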


didikrm commented Mar 22, 2024

I got the same "Unable to load dynamic library … different Team IDs" message quoted above after it said "initializing AI model".

I had the same issue. I reinstalled the app, opened the macOS Terminal, ran ollama pull yourmodelname, then started chatd, and it worked.
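The workaround above amounts to pulling the model with the system-wide CLI before launching chatd; a sketch, with mistral standing in for yourmodelname since that is the model named in the service log earlier in this thread:

```shell
# Pre-pull the model with the system ollama CLI, then start chatd normally.
PULLED=no
if command -v ollama >/dev/null 2>&1; then
  ollama pull mistral && PULLED=yes
else
  echo "ollama CLI not found on PATH; install it from ollama.com first"
fi
```

With the model already present locally, chatd skips the download step that was tripping over the stale temp files.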

@BenjaminStular

Same issue (macOS 14.1). I never installed Ollama, but I do have LM Studio and GPT4All installed (both possibly using some of Ollama's libraries). I closed both, restarted, and tried ollama pull yourmodelname in Terminal; no change. Any ideas?

@BruceMacD (Owner)

I believe this should be fixed in the new v1.1.1 release.

5 participants