Models are not isolated from each other #8937

@imkira

Description

LocalAI version: 3.12.1

Environment, CPU architecture, OS, and Version: Linux

Describe the bug

I've noticed that Qwen3.5-4B-GGUF and Qwen3.5-9B-GGUF both install their multimodal projector to the same path, mmproj/mmproj-F32.gguf, even though the two files have different contents. Because the path is shared, installing the second model overwrites the first model's file, so only the most recently installed projector remains on disk.
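The failure mode can be sketched in a few lines. This is a hypothetical simulation, not LocalAI's actual install code: the `install_model` helper, the directory layout, and the file contents are all made up purely to illustrate how a shared, non-namespaced mmproj path makes the last install win.

```python
import hashlib
import os
import tempfile

def install_model(models_dir, mmproj_bytes):
    # Hypothetical sketch of the bug: every model writes its projector
    # to the same shared path instead of a per-model directory.
    shared = os.path.join(models_dir, "mmproj", "mmproj-F32.gguf")
    os.makedirs(os.path.dirname(shared), exist_ok=True)
    with open(shared, "wb") as f:
        f.write(mmproj_bytes)
    return hashlib.sha256(mmproj_bytes).hexdigest()

with tempfile.TemporaryDirectory() as d:
    # Placeholder contents standing in for the two distinct mmproj files.
    sha_9b = install_model(d, b"projector weights for 9B")
    sha_4b = install_model(d, b"projector weights for 4B")

    shared = os.path.join(d, "mmproj", "mmproj-F32.gguf")
    with open(shared, "rb") as f:
        on_disk = hashlib.sha256(f.read()).hexdigest()

    assert on_disk == sha_4b  # last install wins
    assert on_disk != sha_9b  # 9B's projector was silently replaced
```

A fix along these lines would namespace the path per model (e.g. something like `<model-name>/mmproj/mmproj-F32.gguf`), so that installs cannot clobber each other's files.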

To Reproduce

  1. Install unsloth/Qwen3.5-9B-GGUF
  2. Verify that 9B has its mmproj at llama-cpp/mmproj/mmproj-F32.gguf
  3. Initiate a conversation with 9B
  4. The conversation works
  5. Install unsloth/Qwen3.5-4B-GGUF (this overwrites the shared mmproj file)
  6. Initiate another conversation with 9B
  7. The conversation fails

Expected behavior

The 9B conversation still works after installing 4B; each model's files are installed in isolation so they cannot overwrite each other.
