
AMD Radeon RX 6900 XT on Linux fails with --gpu amd flag #214

Open
johnshaughnessy opened this issue Jan 17, 2024 · 2 comments


johnshaughnessy commented Jan 17, 2024

I have not been able to get llamafile --gpu amd working with an AMD Radeon RX 6900 XT on Linux. The relevant line of the log seems to be:

llamafile: /usr/src/debug/hip-runtime-amd/clr-rocm-5.7.1/rocclr/os/os_posix.cpp:310: static void amd::Os::currentStackInfo(unsigned char**, size_t*): Assertion `Os::currentStackPtr() >= *base - *size && Os::currentStackPtr() < *base && "just checking"' failed.

I found similar bug reports in other projects, so I suspect this is NOT a llamafile bug:

Instead, it seems that ROCm is not supported for my graphics card on Linux:

  Name:                    gfx1030
  Marketing Name:          AMD Radeon RX 6900 XT
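
The target name above comes from rocminfo. As a sketch of how that field can be extracted (the sample output is inlined from the report above so the command can be shown without a GPU present; on a real system you would pipe `rocminfo` itself into the grep):

```shell
# Extract the gfx target from rocminfo-style output.
# The sample text below is copied from the rocminfo output above.
rocminfo_sample='  Name:                    gfx1030
  Marketing Name:          AMD Radeon RX 6900 XT'
printf '%s\n' "$rocminfo_sample" | grep -oE 'gfx[0-9a-f]+'
# prints: gfx1030
```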

Searching the AMD docs, I found:

I tried setting the environment variable HSA_OVERRIDE_GFX_VERSION, because I had seen it mentioned in other issue reports, but did not have any luck.
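
For reference, the override is typically exported before launching; 10.3.0 corresponds to the gfx1030 target reported above (this is a sketch, and the model filename is a placeholder):

```shell
# HSA_OVERRIDE_GFX_VERSION=10.3.0 tells the ROCm runtime to treat the GPU
# as gfx1030 (RDNA2). The model path below is a placeholder.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# ./llamafile --gpu amd -m model.gguf   # hypothetical invocation
echo "$HSA_OVERRIDE_GFX_VERSION"
```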

In case it's helpful, I kept a log of the steps I took when setting things up.

To summarize, I installed ROCm for Arch Linux, but it seems that my graphics card (Radeon RX 6900 XT) is not supported by ROCm on Linux, so I cannot use the --gpu amd flag with llamafile.

If this is correct, then it is not a bug with llamafile. Still, I wanted to file this issue:

  • to ask if this seems correct,
  • to ask if there's anything else worth trying before giving up on my card,
  • to save anyone else the trouble of figuring this out,
  • to offer to make a note in the Gotchas section of the README.md.
@AwesomeApple12

Try ROCm 5.7.1; I think 6.0.0 is too new for your GPU. Also, "export HSA_OVERRIDE_GFX_VERSION=10.3.0" should work just fine on your GPU. Try using the nightly PyTorch:

pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm5.7

@hiepxanh

Install Windows and the problem is solved, @johnshaughnessy.
