
Misc. bug: Ryzen AI Max+ 395, Vulkan is seeing VRAM from system memory and not dedicated GPU memory #16832

@Goldenkoron

Description


Name and Version

Version: llama-b6869-bin-win-vulkan-x64 (also reproduced on other recent versions)

When the UMA frame buffer size is set to 64 GB of GPU memory, llama.cpp sees 64 GB; however, when UMA is set to 96 GB for the iGPU, llama.cpp can only see and use 32 GB.

Operating systems

Windows

Which llama.cpp modules do you know to be affected?

No response

Command line

No response

Problem description & steps to reproduce

With a Ryzen AI Max+ 395 on Windows, run `llama-server --list-devices` and see how much VRAM it reports for the Radeon 8060S.
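To isolate whether the 32 GB cap comes from llama.cpp or from the driver, a minimal standalone Vulkan program can dump the heap sizes the driver actually reports. This is an illustrative sketch, not part of llama.cpp; it assumes the Vulkan SDK/loader is installed (e.g. build with `g++ check_heaps.cpp -lvulkan`):

```cpp
// Dump the memory heaps each Vulkan physical device exposes,
// independent of llama.cpp, so the driver's view can be compared
// against what llama-server reports.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance inst;
    if (vkCreateInstance(&ci, nullptr, &inst) != VK_SUCCESS) {
        fprintf(stderr, "failed to create Vulkan instance\n");
        return 1;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(inst, &count, nullptr);
    std::vector<VkPhysicalDevice> devices(count);
    vkEnumeratePhysicalDevices(inst, &count, devices.data());

    for (VkPhysicalDevice dev : devices) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(dev, &props);
        printf("%s\n", props.deviceName);

        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(dev, &mem);
        for (uint32_t i = 0; i < mem.memoryHeapCount; i++) {
            const VkMemoryHeap &heap = mem.memoryHeaps[i];
            printf("  heap %u: %.1f GiB%s\n", i,
                   heap.size / (1024.0 * 1024.0 * 1024.0),
                   (heap.flags & VK_MEMORY_HEAP_DEVICE_LOCAL_BIT)
                       ? " (DEVICE_LOCAL)" : "");
        }
    }
    vkDestroyInstance(inst, nullptr);
    return 0;
}
```

If the DEVICE_LOCAL heap already shows only ~32 GiB here with the 96 GB UMA setting, the cap is coming from the heap size the AMD driver reports, which llama.cpp's Vulkan backend can only read back, rather than from llama.cpp itself.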

First Bad Commit

No response

Relevant log output

No response
