
bug: Jan does not support Cyrillic (maybe any non-Latin symbols) in the path to models. #1713

Closed
Glashkoff opened this issue Jan 22, 2024 · 4 comments

@Glashkoff

Glashkoff commented Jan 22, 2024

Describe the bug
Nitro fails to load the model on Windows if there is Cyrillic in the path to the models folder. This can happen when the user's profile name is in a national language.

Steps to reproduce
Steps to reproduce the behavior:

  1. Create Windows user profile named, for example, 'Абв'
  2. Install, run Jan and download any model.
  3. Try to chat with the model.
  4. The answer is empty.
  5. Jan writes an error to the log, like [NITRO]::Error: error loading model: failed to open C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf: No such file or directory
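The failure is not at the OS level: a minimal sketch (the directory and file names below are illustrative, not Jan's real layout) shows that a file under a Cyrillic-named directory opens fine when Unicode-aware APIs are used.

```python
import os
import tempfile

# Create a placeholder model file under a Cyrillic-named directory and
# read it back, to confirm the OS itself handles such paths.
with tempfile.TemporaryDirectory() as tmp:
    model_dir = os.path.join(tmp, "Абв", "models", "tinyllama-1.1b")
    os.makedirs(model_dir)
    model_path = os.path.join(model_dir, "model.gguf")
    with open(model_path, "wb") as f:
        f.write(b"GGUF")  # minimal placeholder content
    with open(model_path, "rb") as f:
        data = f.read()
    print(data == b"GGUF")  # True: the Cyrillic path resolves fine
```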

Expected behavior
Jan should allow working with a local AI model in chat mode.

Environment details

  • Operating System: Windows 11 Pro x64 23H2 (build 22631.3007)
  • Jan Version: 0.4.4
  • Processor: AMD Ryzen 5 5600
  • RAM: 32GB
  • Any additional relevant hardware specifics: RTX 3060

Logs
/jan/logs/app.log:

`2024-01-22T23:22:36.794Z [NITRO]::CPU informations - 12
2024-01-22T23:22:36.795Z [NITRO]::Debug: Request to kill Nitro
2024-01-22T23:22:39.916Z [NITRO]::Debug: Nitro process is terminated
2024-01-22T23:22:40.432Z [NITRO]::Debug: Spawning Nitro subprocess...
2024-01-22T23:22:41.086Z [NITRO]::Debug: [ANSI-colored Nitro ASCII-art banner omitted]
2024-01-22T23:22:41.297Z [NITRO]::Debug: Loading model with params {"llama_model_path":"C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf","ctx_len":2048,"prompt_template":"<|system|>\n{system_message}<|user|>\n{prompt}<|assistant|>","system_prompt":"<|system|>\n","user_prompt":"<|user|>\n","ai_prompt":"<|assistant|>","cpu_threads":12}
2024-01-22T23:22:41.333Z [NITRO]::Error: ggml_init_cublas: GGML_CUDA_FORCE_MMQ: no
ggml_init_cublas: CUDA_USE_TENSOR_CORES: yes
ggml_init_cublas: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 3060, compute capability 8.6, VMM: yes

2024-01-22T23:22:41.445Z [NITRO]::Debug: [remainder of ANSI-colored banner omitted]
20240122 23:22:41.085000 UTC 32044 INFO Nitro version: - main.cc:44
20240122 23:22:41.086000 UTC 32044 INFO Server started, listening at: 127.0.0.1:3928 - main.cc:48
20240122 23:22:41.086000 UTC 32044 INFO Please load your model - main.cc:49
20240122 23:22:41.086000 UTC 32044 INFO Number of thread is:12 - main.cc:52
20240122 23:22:41.174000 UTC 32044 INFO Not found models folder, start server as usual - llamaCPP.h:2510
20240122 23:22:41.299000 UTC 33752 INFO Setting up GGML CUBLAS PARAMS - llamaCPP.cc:462
{"timestamp":1705965761,"level":"INFO","function":"loadModelImpl","line":478,"message":"system info","n_threads":12,"total_threads":12,"system_info":"AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 0 | VSX = 0 | "}

2024-01-22T23:22:41.451Z [NITRO]::Error: error loading model: failed to open C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf: No such file or directory
llama_load_model_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model 'C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf'

2024-01-22T23:22:41.451Z [NITRO]::Debug: {"timestamp":1705965761,"level":"ERROR","function":"load_model","line":557,"message":"unable to load model","model":"C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"}
20240122 23:22:41.445000 UTC 33752 ERROR Error loading the model - llamaCPP.cc:482

2024-01-22T23:22:41.486Z [NITRO]::Debug: 20240122 23:22:41.486000 UTC 29104 INFO Resolved request for task_id:0 - llamaCPP.cc:289
20240122 23:22:41.486000 UTC 29104 ERROR Sending more than 1 response for request. Ignoring later response - HttpServer.cc:355

2024-01-22T23:22:45.859Z [NITRO]::Debug: Request to kill Nitro
`

@Glashkoff Glashkoff added the type: bug Something isn't working label Jan 22, 2024
@Van-QA
Contributor

Van-QA commented Jan 26, 2024

@Glashkoff Please try this workaround with our experimental feature while we are resolving the issue:

  1. Install the nightly build of Jan (Jan v0.4.4-193 or higher)
  2. Move the Jan data folder to another location with no special characters in the file path
  3. Restart the app, install the model, and see if the issue is resolved
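A quick way to check whether a given data path is affected (the `~/jan` default location is an assumption; adjust to where Jan actually stores data on your machine):

```python
from pathlib import Path

def has_non_ascii(path: str) -> bool:
    """Return True if any character in the path is outside ASCII."""
    return any(ord(ch) > 127 for ch in path)

# Hypothetical default data folder; adjust to your install.
jan_data = Path.home() / "jan"
print(has_non_ascii(str(jan_data)))
```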

@Glashkoff
Author

Glashkoff commented Jan 26, 2024

@Van-QA Thanks to the new option it is now possible to change the directory to one that contains only Latin symbols, and yes, after the move the model can be launched.
Edit: Removed some text because I misunderstood the purpose of this workaround.

@Van-QA Van-QA added this to the v0.4.7 milestone Jan 27, 2024
@Van-QA Van-QA mentioned this issue Jan 29, 2024
@louis-jan louis-jan modified the milestones: v0.4.7, v0.4.8 Feb 19, 2024
@louis-jan
Contributor

I tested with llama.cpp and encountered the same problem. It appears to be an issue with the engine, not the application. I will continue investigating further.
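One plausible mechanism (an assumption, not confirmed against llama.cpp's source here): the engine passes the UTF-8 path to a narrow-char file API, which Windows interprets in the ANSI codepage, so the bytes no longer name the file on disk. The byte-level mismatch can be sketched as:

```python
# Sketch of the suspected encoding mismatch. cp1251 is the Russian ANSI
# codepage; the actual codepage depends on the Windows locale.
original = "Абв"
utf8_bytes = original.encode("utf-8")   # what the app likely sends
misread = utf8_bytes.decode("cp1251")   # how a narrow-char API may read it

print(misread != original)  # True: the path no longer matches on disk
```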

@0xSage
Contributor

0xSage commented Sep 5, 2024

Closing; if the issue occurs again, it should be filed upstream in the respective engine, e.g. llama.cpp

@0xSage 0xSage closed this as completed Sep 5, 2024