This repository was archived by the owner on Dec 7, 2023. It is now read-only.

Can not read model configuration file #2


Description

@lbarasc

Under Windows:
java -jar chat-console.jar --model llama2 --system

the result is:

Error: chat.octet.exceptions.ServerException: Can not read model configuration file, please make sure it is valid

I have downloaded the model: https://huggingface.co/TheBloke/Llama-2-7B-GGUF/blob/main/llama-2-7b.Q6_K.gguf
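One common cause of this error is a truncated or partial model download, so the GGUF header cannot be parsed. As a quick sanity check (a minimal sketch, not part of chat-console; the class and method names here are hypothetical), you can verify that the downloaded file starts with the 4-byte GGUF magic ("GGUF"):

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Hypothetical helper: checks whether a file begins with the GGUF
// magic bytes 'G','G','U','F'. A truncated or HTML-page download
// (e.g. saving the HuggingFace blob page instead of the raw file)
// will fail this check.
public class GgufCheck {
    public static boolean looksLikeGguf(String path) throws IOException {
        try (RandomAccessFile f = new RandomAccessFile(path, "r")) {
            byte[] magic = new byte[4];
            if (f.read(magic) != 4) {
                return false; // file shorter than 4 bytes
            }
            return magic[0] == 'G' && magic[1] == 'G'
                && magic[2] == 'U' && magic[3] == 'F';
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(looksLikeGguf(args[0]));
    }
}
```

Note that the download link above points at the blob page; the raw file must be fetched via the "download" link (or the `resolve/main/...` URL), otherwise you get an HTML page with a `.gguf` name, which would reproduce exactly this error.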

Labels

help wanted (Extra attention is needed)
